
Biomedical Engineering / Biomedizinische Technik

Joint Journal of the German Society for Biomedical Engineering in VDE and the Austrian and Swiss Societies for Biomedical Engineering and the German Society of Biomaterials

Editor-in-Chief: Dössel, Olaf



Volume 63, Issue 5



Cardiovascular assessment by imaging photoplethysmography – a review

Sebastian Zaunseder (corresponding author) / Alexander Trumpp / Daniel Wedekind / Hagen Malberg

TU Dresden, Institute of Biomedical Engineering, Helmholtzstraße 18, 01069 Dresden, Germany
Published Online: 2018-06-13 | DOI: https://doi.org/10.1515/bmt-2017-0119


Abstract

Over the last few years, the contactless acquisition of cardiovascular parameters using cameras has gained immense attention. The technique provides an optical means to acquire cardiovascular information in a very convenient way. This review provides an overview of the technique’s background and current realizations. Besides giving detailed information on the most widespread application of the technique, namely the contactless acquisition of heart rate, we outline further concepts and critically discuss the current state.

Keywords: camera; cardiovascular; heart rate; imaging photoplethysmography; oximetry; perfusion; remote photoplethysmography


Introduction

Over the last few years, the contactless acquisition of cardiovascular parameters using cameras has gained immense attention. Similar to clinically used photoplethysmography (PPG), this technique, referred to as camera-based PPG, imaging PPG (iPPG) or remote PPG, exploits variations in light modulation due to cardiovascular activity. This work reviews the current state of the technique. We aim to give a comprehensive overview of the background and current realizations of iPPG together with a critical appraisal of the current state. Our work complements earlier reviews on non-contact [1], [2] or camera-based [3], [4], [5], [6], [7] cardiovascular assessment by integrating current physiological understanding and an expanded view on novel developments, even beyond heart rate (HR) assessment, in the extremely dynamic field of iPPG. It should be noted that this review specifically focuses on the assessment of cardiovascular parameters. For the assessment of respiration, which cameras can capture from body movements, from amplitude and baseline variations of extracted PPG signals or even from the heartbeat intervals (respiratory sinus arrhythmia), we refer readers to [8], [9], [10], [11], [12], [13], [14].

The remainder is organized as follows. In the section “Physiological background”, we provide a unifying theory explaining the origin of iPPG signals, based on recent works directed at its principal mechanisms. As most works in iPPG focus on HR and heart rate variability (HRV), and as the extraction of HR is often the first step toward further analyses, this review concentrates on such applications first. All details concerning HR and HRV are given in the section “Camera-based assessment of HR and HRV”. In the section “Physiological measures beyond HR”, we review further opportunities to derive cardiovascular parameters by iPPG, and we finally discuss the current state in the section “Discussion”.

Physiological background

PPG’s background: PPG measures variations in the intensity of transmitted or reflected light. In conventional PPG, the blood volume, blood vessel wall movement and the orientation of red blood cells affect the amount of light at the detector and thus the photoplethysmographic signal [15]. In that context, the arterial vasculature contributes most to the signal’s pulsatile component [16]. In general, PPG can be operated in a transmissive or a reflective mode, where the former is restricted to certain areas (ear lobe, fingertip) [15].

Imaging PPG’s background: iPPG also captures light intensity variations by an optical sensor. However, the basic mechanisms differ (the light penetration depth is expected to be lower [17], [18] in the remote setting) and other factors, most notably movements of the measurement area relative to the sensor, have to be taken into consideration. It is nowadays widely accepted that in iPPG (1) blood volume effects and (2) ballistocardiographic effects contribute to the pulsating character of the signal.

Blood volume effects denote light modulations due to the varying amount of blood in the measurement volume. Two theories have been proposed. First, a direct measurement of the periodically changing vessels’ cross-sections and an associated blood volume change was assumed [19]. This theory is based on the conventional PPG theory. Second, based on the assumption that visible light will not penetrate down to pulsating arteries, Kamshilin et al. and Sidorov et al. proposed an alternative theory [20], [21]. They assumed that the oscillating transmural pressure in larger arteries causes a cyclic deformation of the connective tissue in the dermis. As a result, the capillary density in the papillary dermis varies, which also influences the blood volume within the measurement volume. Both theories assume blood volume changes (actually the amount of hemoglobin in the measurement volume) to be responsible for the measurement signal. Light is thus required to penetrate into the skin. However, the depth of penetration differs between the two theories, as according to Kamshilin et al. the light is not expected to interact with deeper vessels.

Ballistocardiographic effects denote the pulsating component due to movements and are known to occur in iPPG recordings [22], [23]. In this regard, global and local ballistocardiographic effects should be distinguished. Global ballistocardiographic effects denote movements of the measurement area due to distant mechanisms, e.g. movements of the head due to blood ejection into the aorta. Local ballistocardiographic effects denote movements caused by local mechanisms, i.e. tilting due to a larger artery underneath the measurement area. Both global and local effects are superficial effects, i.e. they do not require light to penetrate into the skin but will, due to the movement of the body, lead to amplitude modulations if the region of interest (ROI) is kept static.

Quantification of effects and relevance: It is difficult to argue for either one of the theories explaining the blood volume effect. The important contribution of Marcinkevics et al. [24] demonstrates behavior that matches the conventional PPG theory, namely different behavior at green and infrared wavelengths according to the vessel types and depths that are probed. In fact, based on the penetration depth and the anatomy of the skin, it can be assumed that visible light reaches pulsating vessels (i.e. arterioles) [16], [25], [26]. In the future, extended investigations like those of Marcinkevics et al. and more complex simulations, e.g. as in [27], might contribute to a deeper understanding of the dominant effects. Similarly, quantifying ballistocardiographic effects is difficult. Systematic studies on the origin of iPPG signals revealed that ballistocardiographic effects occur mainly when inhomogeneous and non-orthogonal illumination (incoming light not perpendicular to the skin surface) is applied [22], [28]. To what extent global or local effects impact the signal depends highly on the measurement location and is hard to quantify.

However, as stated earlier, both theories explaining blood volume effects attribute the signal to an increased amount of hemoglobin in the measurement volume, making them, to some extent, comparable. Ballistocardiographic effects, in turn, differ substantially from blood volume effects. They also produce a pulsating signal, but both the phase and the morphology of the resulting signal can differ substantially from signals that stem from blood volume effects. Averaging over areas in which different effects dominate can therefore be destructive. Moreover, some of the applied algorithms and assessed variables only apply to blood volume effects (e.g. if oxygen saturation is assessed, see section “Oximetry”). Against that background, a careful selection of suitable areas should be fostered.

One should be aware that, as far as HR and HRV are concerned, even ballistocardiographic effects carry usable information. However, to yield a ballistocardiogram, it is much more common to exploit movements by tracking features than by exploiting intensity variations [29], [30], [31]. Feature tracking can yield HR/HRV as well as respiration, but this approach is outside the scope of this review.

Camera-based assessment of heart rate and heart rate variability

The following section provides a comprehensive overview on the most important aspects related to the assessment of HR and HRV by iPPG. Table 1 and Figure 1 summarize those aspects.

Table 1:

Aspects related to HR assessment by imaging PPG and the corresponding sections in this work.

Figure 1:

Schematic illustration of the basic procedure to derive the HR by iPPG.

In the example, an RGB video and a rectangular bounding box are used to derive three time varying signals. The transformation from RGB to the Lab color space decisively strengthens the pulsation in the L and a channel, visible after applying a bandpass filter (cut-off frequencies 0.5 and 5 Hz). In the Lab color space, the heart rate (HR) can be easily extracted after applying the Fourier transform.
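The basic procedure of Figure 1 can be sketched in a few lines of Python. The snippet below is a minimal illustration under stated assumptions, not the exact pipeline of any cited work: it operates on a single spatially averaged channel trace (instead of the Lab transformation shown in the figure), applies the same 0.5–5 Hz bandpass and reads the HR off the dominant Fourier peak; the synthetic trace and all parameter values are assumptions for demonstration.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def estimate_hr(trace, fs, f_lo=0.5, f_hi=5.0):
    """Bandpass-filter a spatially averaged channel trace and read the
    heart rate off the dominant peak of its Fourier spectrum."""
    b, a = butter(3, [f_lo / (fs / 2), f_hi / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, trace - np.mean(trace))
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(filtered.size, d=1.0 / fs)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return 60.0 * freqs[band][np.argmax(spectrum[band])]  # beats per minute

# Synthetic 20 s trace at 30 fps with a 1.2 Hz (72 bpm) pulsation
rng = np.random.default_rng(0)
fs = 30.0
t = np.arange(0, 20, 1.0 / fs)
trace = 0.5 + 0.01 * np.sin(2 * np.pi * 1.2 * t) + 0.001 * rng.standard_normal(t.size)
hr = estimate_hr(trace, fs)  # ~72 bpm
```

In a real application, `trace` would be the mean pixel value of the ROI per frame; the window length (here 20 s) directly limits the frequency resolution of the Fourier peak.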

Considered populations and experimental protocols

Experimental protocols: Most available works focus on algorithmic developments. Recordings typically were made under laboratory conditions and were carried out at rest or, if motion robustness was addressed, during predefined movements (e.g. [32], [33], [34], [35]). Some works directed at certain applications carried out specific tests, e.g. driving studies [36], [37] or the use of fitness devices [38], [39].

Considered populations: The distinct majority of works considered healthy subjects only. Investigations involving patients, i.e. subjects with (cardiovascular) diseases, have become more popular as the technique attracts more attention. For example, Rasche et al. [40] and Couderc et al. [41] applied the technique in patients after heart surgery and during atrial fibrillation, respectively, showing the principal applicability. Amelard et al. recently showed that arrhythmias can be detected by iPPG in a wavelet-based time-frequency representation [42]. Furthermore, studies in neonatal intensive care units have gained interest due to the controlled conditions [43], [44], [45], [46]. Other clinical investigations address intraoperative recordings [47], dialysis patients [11], [48] and migraine patients [49].

Evaluation: The populations and experimental protocols are currently one major limitation of iPPG. Based on previous studies or the working principle of iPPG, three factors are likely to affect the performance of iPPG considerably: pathology/age-dependent factors, skin tone and non-stationary conditions (motion, illumination variations).

As stated earlier, there are works that examined pathological subjects. The actual influence of the respective illnesses on the applicability of iPPG, however, was rarely addressed. It is consequently difficult to draw any generally valid conclusions. The factor of age has hardly been assessed, as usually exclusively younger subjects are considered. To the best of our knowledge, only our group analyzed a larger set of individuals whose average age exceeded 70 years [40], [50]. In this case, however, healthy elderly and younger subjects were not included, so that a proper quantification of the effects of pathology and age is not possible. To reliably quantify these effects, larger populations of mixed age and health state are needed.

A couple of works considered the impact of skin pigmentation [51], [52], [53], [54]. Those works show a tendency for higher values on the Fitzpatrick scale to degrade the results, but proper methods might be able to compensate for skin tone. For example, Wang et al. [54] proposed such a method. However, the analysis in [54] concerning skin tone is based on 15 subjects. To quantitatively estimate the effect of skin tone and to develop generally applicable methods to compensate for it, larger populations must be considered.

Many works show the dramatic effect of non-stationary conditions on the performance of processing methods [e.g. a drop of the signal-to-noise ratio (SNR) from over 10 dB to approximately 5 dB [33], a reduction of more than 50% in the correlation of iPPG HR and reference HR [55], etc.]. Notably, in many cases, algorithms which were previously shown to produce stable results even under non-stationary conditions do not yield satisfactory results when applied to other data. The most likely explanation is again the data used, which are not always representative.

As can be seen, a proper quantification of the impact of all these factors suffers from restricted data. In fact, most often only a few subjects (typically fewer than 20) were involved. Even worse, the datasets are hardly comparable and, thus, neither is the performance of the proposed methods. Fortunately, publicly available data have become more popular, e.g. data from [35], [56], [57], the MAHNOB-HCI database [58], the UBFC-RPPG database [59] and the DEAP dataset [60] are available (on request) or their release has been announced.

Future methodological developments should consequently use these data (at least together with own data) to allow more meaningful comparisons and an objective assessment of proposed methods. Besides, in situ studies featuring the application of iPPG under real-world conditions and applied to a representative sample of the respective target group are badly needed. Such data are missing today (apart from some car driving studies). Obviously, the effort is much greater than for laboratory studies, but only in this way can statements beyond principal feasibility be made, namely concerning the ability to generalize sufficiently and, even more importantly, an added (clinical) value of iPPG.


Used hardware

Color channels: Owing to the cameras used, the vast majority of works rely on information from the red, green and blue (RGB) channels. As shown later, static and dynamic methods allow color channels to be combined. The green channel has turned out to be the one with the highest SNR [61]. Therefore, some works use only this channel from RGB or even apply monochromatic cameras [49], [62], [63] with green color filters (such cameras typically provide a higher SNR). In an attempt to identify beneficial color combinations, McDuff et al. used a five-band camera and showed the best combination to be cyan, green and orange [10]. Near-infrared (NIR) cameras have been used as well [42], [64], [65], [66], [67]. NIR systems operate without visible illumination, which is advantageous for applications like driver monitoring and sleep studies [2]. The drawback of using NIR is the lower absorption by hemoglobin [25] and a resulting low SNR.

Camera technique: Initially, industrial cameras were used; today, due to their high availability, low-cost cameras, i.e. consumer electronics like webcams, are used more frequently. The color depth is typically 8 bit per color channel. Higher color depths of 14, 12 and 10 bit found application in [65], [66], in [23], [40] and in [68], respectively. Applied resolutions vary greatly, ranging from 1920×1080 pixels, e.g. in [36], [52], down to 300×300 pixels [43] and 320×240 pixels [69]. The most common resolution is 640×480 pixels. Sampling frequencies are typically 30 fps or below. For specific purposes, higher sampling frequencies up to 420 fps have been evaluated [70], [71], [72], [73], [74]. In general, higher sampling frequencies are assumed to better resolve temporal characteristics. Particularly concerning HRV, for applications which try to access the morphology of the signal or for spatio-temporal applications (see also section “Physiological measures beyond HR”), a higher temporal resolution is advantageous. Most researchers make use of a single camera. However, Estepp et al. [71] and Blackford et al. [75], for example, showed the advantage of multiple cameras in nine- and three-camera setups, respectively. Particularly in the case of motion, multiple cameras might be able to compensate for the resulting variations. Videos are typically stored in an uncompressed format. McDuff et al. recently showed that physiological information can be extracted despite compression, though the quality is degraded [57].

Polarization: The usage of polarization, i.e. polarized illumination and polarization filters, can help to reduce artifacts, particularly superficial reflections. Some works make use of polarization filtration [24], [76], [77], but past statements on the effectiveness of polarization differ (Hülsbusch [17] reported no improvement, whereas Sidorov et al. [63] found improved reliability using orthogonal polarization). A recent work, however, demonstrated that orthogonal polarization filtration yields higher signal quality, helps to separate blood volume effects from ballistocardiographic effects and generally increases the understanding of the iPPG signal’s origin [28]. Such findings strongly suggest using the filter technique. It requires, however, a much more complex setup including polarized illumination, which is problematic for many applications.

Illumination: Hülsbusch [17] and Moco et al. [22] showed that a homogeneous illumination should be applied, and some authors optimize illumination, e.g. Guazzi et al. [78] used spatial light sources and reflecting materials yielding a diffuse and homogeneous illumination. Amelard et al. used a temporally coded illumination sequence in order to measure and compensate for active and ambient illumination [70]. Most often, however, common ambient illumination, i.e. artificial light, natural light or their combination, is used.

Evaluation: The used hardware reflects the availability and ease of use of consumer cameras with integrated optics and off-the-shelf illumination. In many situations, particularly to prove algorithmic concepts, simple setups are sufficient. However, concerning real-life applications and considering the deepening knowledge on the technique, e.g. concerning beneficial color channels or the benefit of polarization, more specific systems might be applied in the future. Defining minimal requirements or optimal values for the used hardware, e.g. regarding resolution or illumination, is difficult. Although some investigations attempt to do so (e.g. [72], [79], [80]), the complex interdependence of measurement distance, illumination and camera technique raises doubts about the significance of such investigations.

Image processing

ROI definition: Most of the proposed works make use of the face or parts of it as the ROI. The face is typically uncovered and well perfused, providing an ideal measurement location. Face detection, thus, is an essential step in iPPG. The Viola-Jones face detector is the most common choice for this task [81]. Differences exist regarding the facial parts that are used, i.e. the whole face or only parts of it. Works investigating the spatial distribution of signal quality showed the forehead and cheeks to provide promising results [23], [56], [68], [82]. The area around the mouth/lips was also reported to yield good results [30]. Both the definition of such regions based on facial features (e.g. [35], [83], [84], [85], [86]) and the usage of the whole face, e.g. defined by a rectangular bounding box or a predefined percentage of it, are common (e.g. [35], [87], [88], [89], [90]). More complex models have been used to segment the face, i.e. facial landmark localization. For example, Bousefsaf et al. [32] used the method proposed in [91], Stricker et al. [35] used the deformable model fitting by regularized landmark mean-shift by Saragih et al. [92], and McDuff et al. [93] used the method proposed in [94].

Besides segmenting regions based on anatomical characteristics, some works use non-anatomical image information, e.g. gradients, to refine the ROI from a previously identified facial area [42], [51], [95]. Also, the use of color information to refine a previously defined ROI (e.g. by applying thresholds [8], [32], [96], [97] or using GrabCut [35]) or a refinement by exploiting the temporal pulsating behavior of usable pixels/regions was proposed [34], [59], [98], [99], [100]. Approaches exploiting local color characteristics and homogeneity can also be applied without explicit face detection [50].
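As a concrete illustration of a color-based refinement, the sketch below keeps only those ROI pixels whose green value falls into a plausible range before spatial averaging. The thresholds, the bounding box and the synthetic frame are assumptions for demonstration, not the values used in any of the cited works.

```python
import numpy as np

def refined_roi_mean(frame, roi, g_min=60, g_max=220):
    """Average the green channel over a rectangular ROI, but only over
    pixels whose green value lies inside an assumed plausible skin range
    (excluding, e.g., specular highlights and dark background)."""
    x, y, w, h = roi
    green = frame[y:y + h, x:x + w, 1].astype(float)
    mask = (green >= g_min) & (green <= g_max)
    return green[mask].mean() if mask.any() else np.nan

# One synthetic 8-bit RGB frame with a saturated patch to be excluded
frame = np.full((120, 160, 3), 128, dtype=np.uint8)
frame[40:50, 50:60, 1] = 255          # e.g. a specular highlight
sample = refined_roi_mean(frame, (40, 30, 80, 60))  # one iPPG sample per frame
```

Applying this per frame yields the ROI time series from which the iPPG signal is formed; per-frame recomputation of the mask makes the refinement adaptive to changing illumination.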

Using regions other than the face is not common. However, the use of the lower leg [65], [101], the palm [9], [20], [72], [101], [102], [103], [104] and the forearm [73], [103], [105], [106] has proven feasible in principle, although the signals are typically of lower quality (see, e.g. [103], [104], which contain direct comparisons).

ROI tracking: To track the ROI, most commonly the Kanade-Lucas-Tomasi algorithm [107], [108] is applied (e.g. in [56], [84], [109], [110]). Other approaches to track the ROI include more complex models, e.g. the deformable model fitting by regularized landmark mean-shift by Saragih et al. [92] used in [35]. Wang et al. [100] combined a global tracking-by-detection method [111] with a local tracking approach using optical flow by Farneback [112]. Obviously, tracking is required when subjects move in order to keep the ROI in the desired location. However, tracking might also introduce some jitter in the ROI, which is likely to impair the results at rest [23], [110]. Moreover, stronger movements will likely change the illumination condition, i.e. the brightness of the ROI. In such cases, even perfect tracking will not suffice, and additional processing steps are necessary to avoid artifacts. Trumpp et al. [50], for example, only tracked areas with a similar intensity distribution. Another possible solution is the overlap-add operation, i.e. a weighted summation of signals to avoid artifacts due to changing ROIs [34]. Amelard et al. [42] applied a Bayesian framework to track the pulsatile regions of the video.

Evaluation: Both ROI selection and tracking are crucial aspects of iPPG. As the selection is the precondition for any successful extraction, it is even more decisive. According to current findings, the origin of iPPG, i.e. the existence of blood volume effects and ballistocardiographic effects, calls for using regions that are homogeneous with respect to the measured effect. This favors approaches that refine an anatomically motivated ROI (like cheeks and forehead). Under defined laboratory conditions, the signal quality is likely to be sufficient without such refinements, but particularly in real-world scenarios a careful refinement might become crucial (see, e.g. [32], [68], [109] for a comparison of ROIs).

Color channels and their combination

Basic idea: Both the usage of a single color channel (most often the green channel, which typically yields the highest signal quality [9], [61]) and combinations of color channels are common. For the combination, two approaches exist. On the one hand, channels can be combined using a priori knowledge. Such approaches are based on the assumption that the pulsation, as well as artifacts, are differently pronounced in channels of different color spaces or projection spaces (see section “Color spaces – knowledge-based channel combination”). On the other hand, blind source separation (BSS) can be used to combine color channels. BSS pursues a data-driven combination of color channels, i.e. yields a dynamic combination, and is described in the section “Source separation – data-driven channel combination”.

Color spaces – knowledge-based channel combination

The transformation of the RGB color space into other representations is intended to better separate photoplethysmographic effects from distortions. In this regard, chrominance-based approaches are the most important to mention. For example, de Haan and Jeanne [53] introduced different empirically reasoned weighted combinations of the R, G and B channels, which yield a chrominance signal and are widely used [97]. The group later presented a similar approach to derive an iPPG signal with improved motion robustness [39]. Based on physiological characteristics and the filter properties of the used camera model, they defined a blood volume signature which was eventually exploited to weight the input R, G, B signals. Bousefsaf et al. compared different combinations [32] and used the u* channel of the International Commission on Illumination (CIE) L*u*v* color space [8], [113] as the input signal, while Yang et al. used chromaticity from the CIE L*a*b* color space, i.e. a* and b*, where a* outperformed b* [114]. Ruminski [115] showed that YUV channels, particularly the V channel, outperformed RGB (alone or combined with source separation techniques). Lueangwattana et al., in turn, used the hue, saturation, and value (HSV) space and, in particular, the hue channel [116]. In a comparison of color channels, Tsouri and Li [117] also found hue to be the best choice, while U and Y from CIE YUV and CIE XYZ provided a similar accuracy. Stricker et al. [35] used a simple normalization of the form G/(R+B+G) to yield the signal for HR extraction. Xu et al. [118] proposed to use a signal defined by a logarithmic quotient of color channels. At each time instant t, the resulting value is calculated by x(t) = log[G(t+1)·R(t) / (R(t+1)·G(t))], where R(t) and G(t) denote the red and green value, respectively, at time t. Wang et al. did not rely on a predefined color space but proposed a transformation based on a simplified model of skin-light interaction. The method, entitled Plane-Orthogonal-to-Skin, combines normalized RGB channels into two novel channels, which are fused by weighting into the desired signal [54]. A data-driven extension adaptively estimates the aforementioned plane given a prior skin-pixel detection and assesses the plane rotation for pulse extraction [119].
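The two closed-form combinations quoted above translate directly into code. The sketch below implements the normalization of Stricker et al. and the logarithmic quotient of Xu et al. as given in the text; the example channel values are arbitrary assumptions for illustration.

```python
import numpy as np

def g_normalized(r, g, b):
    """Stricker et al.: G / (R + B + G), computed per frame."""
    return g / (r + b + g)

def log_quotient(r, g):
    """Xu et al.: x(t) = log[G(t+1) R(t) / (R(t+1) G(t))] over a trace."""
    return np.log((g[1:] * r[:-1]) / (r[1:] * g[:-1]))

# Arbitrary mean channel values for three consecutive frames
r = np.array([100.0, 101.0, 100.5])
g = np.array([80.0, 82.0, 81.0])
b = np.array([90.0, 90.5, 90.2])
gn = g_normalized(r, g, b)   # one value per frame
xq = log_quotient(r, g)      # one value per frame transition
```

Note that the log-quotient signal is one sample shorter than the input, since it operates on transitions between consecutive frames.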

Evaluation: Comparative works prove the potential of color transformations. Although source separation techniques can provide equivalent information, they suffer from permutation indeterminacy [120] and are not always effective (see section “Source separation – data-driven channel combination”). A fusion of static and data-driven color channel combinations has been proposed [39]. Against that background, combination based on a priori knowledge is a very reasonable choice and can be expected to gain importance. Chrominance is the most widely used and has proven to be a good choice; other, less popular realizations should be comparatively validated.

Source separation – data-driven channel combination

Applied strategies and algorithms: Since its first use by Poh et al. [89], BSS has become a core part of signal processing in iPPG. BSS aims at separating the desired signal content (i.e. the cardiac pulse) from noise and artifacts. Principal component analysis (PCA) and independent component analysis (ICA) [121] have been used. Standard realizations such as JADE [89] or FastICA [85], and extensions like joint BSS [37], spatio-temporal ICA [38], constrained ICA [122], radical ICA [87], robust ICA [123] and zero phase component analysis (ZCA) [84] have been applied to iPPG. Most commonly, BSS is applied in a multispectral setting, i.e. different color channels from a single ROI serve as input to BSS. Typically, RGB channels extracted from the whole face as ROI are used [39], [53], [85], [87], [89], [90], [124]. Alternative color channels (orange, cyan, NIR) and color spaces (chrominance and hue) [39], [88], [93], [100], [116], [125], as well as more selective ROI choices, i.e. not using the whole face in order to exclude regions that are not expected to contribute a useful signal, have also been studied [10], [38], [84], [85], [87], [123], [126]. Recently, the application of ICA using variable time lengths was proposed [86].

An alternative to the aforementioned multispectral BSS is spatial BSS. Monochrome iPPG signals, extracted from spatially separated ROIs, were used as input for ICA [120], [127]. A spatio-temporal extension of single-ROI monochrome iPPG has been proposed [38]. Wang et al. identified PCA inputs without explicit ROI detection by exploiting the temporal behavior of pixel traces to identify suitable regions [100]. Lam and Kuno also made use of spatially separated regions, so-called patches, which are subregions of a previously defined larger ROI [128]. By repeatedly choosing two patches at random as input to ICA, they generate multiple HR estimates from which the true HR is estimated.
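A data-driven channel combination of the kind described here can be sketched with a plain PCA. The code below is a schematic illustration, not the method of any single cited work: it projects zero-mean input traces (color channels or spatial ROIs alike) onto their principal components and picks the component with the strongest spectral peak in the expected HR band, which is one simple heuristic for resolving the component-selection ambiguity.

```python
import numpy as np

def pca_pulse(traces, fs, f_lo=0.7, f_hi=4.0):
    """Project zero-mean traces (channels x samples) onto their principal
    components and return the component with the strongest spectral peak
    inside the expected HR band. Band-power selection is one heuristic;
    the cited works resolve the ambiguity in different ways."""
    X = traces - traces.mean(axis=1, keepdims=True)
    cov = X @ X.T / X.shape[1]
    _, vecs = np.linalg.eigh(cov)          # eigenvectors as columns
    comps = vecs.T @ X                     # candidate source signals
    freqs = np.fft.rfftfreq(X.shape[1], d=1.0 / fs)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    power = [np.abs(np.fft.rfft(c))[band].max() for c in comps]
    return comps[int(np.argmax(power))]

# Three synthetic channels sharing a 1.2 Hz pulse with different strengths
rng = np.random.default_rng(1)
fs = 30.0
t = np.arange(0, 20, 1.0 / fs)
pulse = np.sin(2 * np.pi * 1.2 * t)
traces = np.outer([0.3, 0.8, 0.2], pulse) + 0.05 * rng.standard_normal((3, t.size))
component = pca_pulse(traces, fs)  # recovers the shared pulsation
```

The same function applies unchanged to the spatial setting, with each row holding a monochrome trace from a separate ROI instead of a color channel.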

Finally, combinations of spatial and multispectral BSS are also found. The approach in [14], [129] additionally provides signals from a corrupted area in order to make BSS more stable. McDuff et al. combined multispectral recordings from multiple cameras, also yielding a kind of multispectral and spatial combination [33].

Evaluation: Despite its frequent use, there is neither consensus on the benefit of BSS application in general, nor regarding the setting in which BSS should be applied. For example, Kwon et al. [124] and Feng et al. [126] reported an increased HR error and a lack of robustness, respectively, while Christinaki et al. reported only subtle improvements when using multispectral input [123]. Wedekind et al. [127] shed some more light on the topic by comparing multispectral and spatial inputs for PCA and ICA. They showed the spatial input to be advantageous and the outcome of BSS to depend strongly on the input quality, with BSS possibly even degrading the signal quality. Such investigations strongly suggest applying BSS only conditionally and preselecting its inputs. Moreover, the problem of permutation indeterminacy [120] deserves more attention in order to raise the practical value of BSS techniques.

Signal processing

Applied methods: Detrending and/or bandpass filtering are commonly applied (e.g. [84], [90], [93]). Bandpass filters at least cover a predefined range of expected HRs (e.g. 0.7–3.0 Hz [130], 0.75–4.5 Hz [93], 0.7–4 Hz [90]). Sometimes higher-frequency content of the signal course is retained as well (e.g. from 0.1 Hz to 8 Hz [84] or 0.4 Hz to 10 Hz [99]). Besides conventional frequency-selective filters, Wang et al. proposed a simple amplitude-selective filter [131]. The filter exploits the physiologically plausible range of color variations caused by perfusion.
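
A conventional frequency-selective filter as used in many of the cited works can be sketched as a zero-phase Butterworth bandpass over the expected HR range; the order and the cut-off defaults below are assumptions, not values prescribed by any particular reference.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_ippg(signal, fs, low=0.7, high=4.0, order=3):
    """Zero-phase Butterworth bandpass covering the expected HR range.

    filtfilt applies the filter forward and backward, which avoids the
    phase distortion that would shift beat locations in time.
    """
    b, a = butter(order, [low, high], btype="bandpass", fs=fs)
    return filtfilt(b, a, signal)
```

For beat detection and HRV analysis, wider passbands would be chosen to preserve waveform morphology, as discussed below.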

Some works propose specific signal processing techniques. For example, Bousefsaf et al. [8] and Wu et al. [132] used the continuous wavelet transform (CWT) to denoise the signal by filtering and weighting the wavelet coefficients, respectively, before inverse transformation. Huang et al. [97] used the CWT to identify the most suitable scale and applied an inverse transform using that scale, which effectively yields an adaptive bandpass. Feng et al. [126] also applied an adaptive bandpass by considering the most dominant peaks in the Fourier domain, whereas Sun et al. [133] made use of empirical mode decomposition. Jiang et al. [134] utilized a Kalman filter to filter the signal obtained from the green channel.
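
The idea of an adaptive bandpass can be illustrated in the Fourier domain, in the spirit of the approaches above: locate the dominant in-band spectral peak and keep only a narrow band around it. This is a simplified sketch, not the exact algorithm of [97] or [126]; the half-width of the retained band is an assumed parameter.

```python
import numpy as np

def adaptive_bandpass(signal, fs, search=(0.7, 4.0), half_width=0.2):
    """Keep only a narrow band around the dominant in-band spectral peak.

    Returns the filtered signal and the detected peak frequency f0 (Hz).
    """
    n = len(signal)
    spec = np.fft.rfft(signal - np.mean(signal))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    in_band = (freqs >= search[0]) & (freqs <= search[1])
    f0 = freqs[in_band][np.argmax(np.abs(spec[in_band]))]
    mask = np.abs(freqs - f0) <= half_width   # narrow passband around f0
    return np.fft.irfft(spec * mask, n=n), f0
```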

Evaluation: Notably, signal processing is applied before, after, or even both before and after channel combination. Most often, conventional bandpass filters are employed. The passband is typically limited to expected HRs, i.e. below 4 Hz. This approach is well suited to yield the mean HR. However, all approaches which aim at beat detection and HRV analysis should define the passband carefully, because beat detection and HRV analysis suffer from the loss of morphology and temporal information caused by filtering [135].

Heart rate extraction and analysis

Methods to extract the HR: To extract the HR, windows of predefined length are most often considered (10 s is widely used, but longer intervals such as 18 s [51], 20 s [14] and 60 s [56] have also been described). Using windows allows extraction in the frequency domain after applying the Fourier transform (e.g. [35], [39], [40], [51], [89]) or autoregressive models [11], [136]. Extraction in the time domain by autocorrelation (e.g. [42], [87], [115]) is another approach. Sliding window approaches yield a time series of mean HRs. The window length constitutes a trade-off between opposing requirements: the desired temporal resolution and the nonstationary nature of HR call for short windows, whereas the attenuation of effects other than HR benefits from longer windows.
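
The sliding-window, frequency-domain variant can be sketched as follows: each window is tapered, zero-padded for finer spectral resolution, and the dominant in-band peak is converted to beats per minute. Window length, step, and the zero-padding factor are illustrative choices, not values from the cited works.

```python
import numpy as np

def window_hr(signal, fs, win_s=10.0, step_s=1.0, band=(0.7, 4.0)):
    """Sliding-window mean HR (bpm) from the dominant spectral peak."""
    win = int(win_s * fs)
    step = int(step_s * fs)
    hrs = []
    for start in range(0, len(signal) - win + 1, step):
        seg = signal[start:start + win]
        seg = (seg - seg.mean()) * np.hanning(win)     # taper the window
        spec = np.abs(np.fft.rfft(seg, n=8 * win))     # zero-pad for resolution
        freqs = np.fft.rfftfreq(8 * win, d=1.0 / fs)
        in_band = (freqs >= band[0]) & (freqs <= band[1])
        hrs.append(60.0 * freqs[in_band][np.argmax(spec[in_band])])
    return np.array(hrs)
```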

Alternatively, single beats are detected, yielding a beat-to-beat time series. Such attempts aim at providing the basis for HRV analysis. The technique typically requires filtering (bandpass filters in a predefined range of expected HRs) and applies maximum detectors (with adaptive thresholds) to the generated feature signal [8], [32], [41], [83], [97], [138], [139], [140].
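
A maximum detector with an adaptive threshold, as used by the works above, can be sketched on a prefiltered signal; the threshold rule (mean plus a fraction of the standard deviation) and the refractory distance derived from a maximum plausible HR are assumptions for illustration.

```python
import numpy as np
from scipy.signal import find_peaks

def detect_beats(signal, fs, min_hr=40.0, max_hr=240.0):
    """Detect single beats in a prefiltered iPPG signal.

    Returns beat times (s) and beat-to-beat intervals (s). The minimum
    peak distance acts as a refractory period ruling out implausible HRs.
    """
    min_dist = int(fs * 60.0 / max_hr)                  # refractory period
    height = np.mean(signal) + 0.3 * np.std(signal)     # adaptive threshold
    peaks, _ = find_peaks(signal, height=height, distance=min_dist)
    beat_times = peaks / fs
    return beat_times, np.diff(beat_times)
```

The resulting interval series is the input for HRV measures; as discussed below, its temporal resolution is bounded by the frame rate.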

Besides such traditional approaches, machine learning techniques to detect the HR from signal excerpts or spectra have also been proposed [88], [141].

Postprocessing: Any of these methods, whether window-based, beat-to-beat or machine learning-based, can fail, particularly in the case of movements. Some works therefore address the postprocessing of HR series. Applied concepts include Kalman filtering [48], [142], conventional outlier detection methods [130], [143] and machine learning techniques that combine signals after ICA to extract a robust HR by considering various spectral features [144]. Bayesian HR fusion was proposed in [145].
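
Conventional outlier handling on an HR series can be as simple as comparing each estimate to a running median and replacing implausible jumps. This sketch is one generic realization, not the specific method of [130] or [143]; the deviation threshold and window size are assumed parameters.

```python
import numpy as np

def correct_hr_series(hr, max_dev=10.0, win=5):
    """Replace HR estimates deviating strongly from a running median.

    hr : sequence of HR estimates in bpm.
    max_dev : maximum plausible deviation (bpm) from the local median.
    """
    hr = np.asarray(hr, dtype=float)
    out = hr.copy()
    half = win // 2
    for i in range(len(hr)):
        lo, hi = max(0, i - half), min(len(hr), i + half + 1)
        med = np.median(hr[lo:hi])
        if abs(hr[i] - med) > max_dev:   # implausible jump -> outlier
            out[i] = med
    return out
```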

Evaluation: Most early works focus on reliable HR extraction. As the diagnostic value of the HR alone is limited, camera-based HRV analysis is much more interesting and is gaining importance. A couple of works show the feasibility of camera-based HRV analysis [33], [72], [90], [93], [97], [137], [146], [147]. These works typically found the errors of beat-to-beat intervals or of standard HRV measures from the time or frequency domain [148] between iPPG and a reference sensor to be small enough. However, care should be taken concerning HRV analysis from iPPG for multiple reasons. First, studies on patients and specific populations, e.g. the elderly, which often show altered HRV, are largely missing. Second, the applied filtering often uses very low cut-off frequencies. Such filtering applied before beat detection lowers the temporal resolution, which is often not properly considered. Third, sampling frequencies around 30 fps, as frequently applied, generally raise doubts concerning the applicability of variability analysis and explain differences between iPPG and a reference. The latter particularly affects high-frequency (HF) HRV measures. In fact, differences were shown in [99], where a frame rate of 30 fps was used. Even at higher frame rates, differences occur. Iozzia et al. [84] and Valenza et al. [149] found systematic differences between camera-based and reference HRV parameters at a frame rate of 60 fps. Sun et al. [72] used 200 fps and reported the feasibility of camera-based pulse rate variability analysis. However, the correlation of LF/HF, where LF denotes the low-frequency component, between iPPG and a contact PPG drops by 10% compared to the correlation of the normalized LF and HF power. At least in studies that use the ECG to calculate the reference HRV, the observed differences might reflect the general limitations of using the pulse rate variability as a substitute for the HRV: although the measures can be interchangeable, differences might occur, particularly under non-stationary conditions [150], [151]. Besides these limitations of camera-based HRV analyses, investigations addressing arrhythmia detection by cameras still show high error rates [41], which may impose problems even for HRV analyses. In conclusion, more studies, particularly those taking into account patients, should be carried out in order to provide clear evidence that today’s techniques allow for a reliable HRV analysis from cameras.

Physiological measures beyond HR

Although most works related to iPPG have been directed at HR and HRV so far, there are other derivable measures and applications which are likely to gain importance in the future. The following section provides an overview of the most important research activities. It should be noted that, although many processing steps explained in the section “Camera-based assessment of HR and HRV” may also apply when extracting further measures, there might be restrictions, e.g. constraints may apply to the selection of the ROI, the application of filters and the combination of color channels.

Assessment of oxygen saturation

Most works directed at determining the oxygen saturation rely on the well-established principle of pulse oximetry, i.e. they record photoplethysmograms at different wavelengths and determine the oxygen saturation from the ratio of ratios [between the alternating current (AC) and direct current (DC) components at both wavelengths]. Wieringa et al. [152] paved the way for camera-based oximetry by showing that remote recordings at different wavelengths are feasible. Early work of Humphreys et al. [105] provided further steps toward measuring the oxygen saturation by testing a triggered monochromatic camera with illumination at 760 nm and 880 nm at the forearm. Kong et al. [69] used two monochromatic cameras equipped with narrow-band filters at 520 nm and 660 nm to determine the oxygen saturation from the area under the eyes. Fan and Li used the same wavelengths, but a single monochromatic camera and motorized optical filters [153]. Shao et al. [30] used a monochromatic camera and measured at 610 nm (orange) and 880 nm (NIR) while the face was illuminated from two sides; the area around the lips was used to determine the oxygen saturation. Verkruysse et al. [18] used two cameras equipped with filters in the red and infrared range to determine SpO2 estimates, including extensive experiments on the calibration of the system. The usability of multispectral cameras, which eliminates the need for triggered illumination, has also been shown. Tarassenko et al. [11] estimated the oxygen saturation from facial videos using the blue and red channels of an RGB camera. Guazzi et al. [78] extended this work by an adaptive ROI selection based on signal quality. In an animal study, Addison et al. used the red and green channels of an RGB camera [154].
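
The ratio-of-ratios principle underlying these works can be sketched in a few lines: the AC component of each wavelength channel is normalized by its DC component, and the ratio of the two normalized pulsations maps to SpO2 via an empirical calibration. In this sketch, AC is approximated by the standard deviation and DC by the mean, and the linear calibration constants `a` and `b` are placeholders; real values must be determined empirically per setup, as [18] demonstrates.

```python
import numpy as np

def ratio_of_ratios(sig_w1, sig_w2):
    """Ratio of ratios R = (AC1/DC1) / (AC2/DC2) for two wavelength
    channels; AC is approximated by the standard deviation, DC by the mean."""
    ac1, dc1 = np.std(sig_w1), np.mean(sig_w1)
    ac2, dc2 = np.std(sig_w2), np.mean(sig_w2)
    return (ac1 / dc1) / (ac2 / dc2)

def spo2_from_ratio(r, a=110.0, b=25.0):
    """Linear calibration SpO2 = a - b*R; a and b are placeholder values
    that must be calibrated empirically for a given camera setup."""
    return a - b * r
```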

Some works took a different approach. Mishra et al. also took advantage of a ratio of two signals. Interestingly, they did not use different wavelengths but exploited the effect of polarization to generate one superficial signal and one signal from deeper layers, from which the ratio, and in turn the oxygen saturation, is derived [155]. Nishidate et al. estimated, based on a Monte Carlo simulation, the concentrations of oxygenated and deoxygenated blood and of melanin by multiple regression analysis [156]. The oxygen saturation is derived from the concentrations of oxygenated and deoxygenated blood.

Assessment of vascular state

Some recent works are directed at iPPG’s ability to assess vasomotor activity and the vascular state. In this regard, Trumpp et al. [73] and Bousefsaf et al. [113] showed the effects of vasomotor activity using the green channel. Kamshilin et al. applied iPPG to monitor the effect of vascular occlusion [157]. Marcinkevics et al. [24] showed that, by using green and NIR light, vasoactivity can be assessed even at different depths. Blanik et al. [158] demonstrated variations in perfusion in the frequency bands related to heartbeat, respiration and vasomotor rhythmicity in the context of allergic testing. All the aforementioned works made use of amplitude information, i.e. the strength of pulsation. Nishidate et al. [156] also assessed vasomotor activity, but they used the total blood concentration obtained by combining the oxygenated and deoxygenated blood concentrations. Similarly, Nakano et al. [159] used this technique to estimate venous compliance. Moço et al. [160] recently proposed deriving signals from the neck in order to determine arterial stiffness and other vascular parameters by waveform analysis. The approach exploits local ballistocardiographic effects, which are dominant in proximity to the carotid artery. Amelard et al. also focused on the neck, but they showed that the jugular venous pulse can be extracted by cameras [161].

Assessment of pulse transit time and pulse wave velocity

Yang et al. [162] measured blood flow velocities from the spatial pulsation characteristics in the face using a single camera. Jeong and Finkelstein [74], Kaur et al. [163] and Shao et al. [9] estimated the pulse transit time (PTT) from recordings of the face and palm(s) using one or two cameras. Murakami et al. [101] estimated the PTT using a single camera with ROIs defined at the ankle and the wrist. Kamshilin et al. [164] measured the PTT using facial videos and the electrocardiogram (ECG), showing spatial characteristics and inhomogeneity. Zhang et al. [165] combined non-contact and contact methods by measuring the PTT between a facial iPPG and a finger PPG. They showed a medium negative correlation with systolic blood pressure.
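
The cited works differ in how they determine the delay between two measurement sites; one common and simple choice, shown here as an illustrative sketch rather than the method of any particular reference, is the lag that maximizes the cross-correlation between the proximal and distal pulse signals.

```python
import numpy as np

def pulse_transit_time(sig_proximal, sig_distal, fs, max_lag_s=0.5):
    """Estimate the PTT (s) as the lag maximizing the cross-correlation
    between two site signals (e.g. face and palm), searched within
    +/- max_lag_s; positive values mean the distal signal lags."""
    a = sig_proximal - np.mean(sig_proximal)
    b = sig_distal - np.mean(sig_distal)
    max_lag = int(max_lag_s * fs)
    xcorr = np.correlate(b, a, mode="full")
    mid = len(a) - 1                          # zero-lag index
    lags = np.arange(-max_lag, max_lag + 1)
    best = lags[np.argmax(xcorr[mid - max_lag: mid + max_lag + 1])]
    return best / fs
```

Note that with quasi-periodic pulse signals the search range must stay below one pulse period to avoid ambiguous lags.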

However, although some works show the feasibility of PTT measurements, the results are not consistent. Secerbegovic et al. [166] also recorded the palm and face using a single camera but showed the PTT, and the blood pressure estimation based on it, to be inaccurate (compared to using iPPG from the forehead and the ECG). Sugita et al. [167] found a correlation between blood pressure and camera-based PTT (from facial regions and the palm), but, contrary to expectations, a positive rather than a negative one.

Spatial assessment

A few works try to exploit the spatio-temporal characteristics of iPPG. Kamshilin et al. proposed a method to dynamically visualize the pulsation in amplitude and phase maps [62]. From the underlying technique, spatio-temporal parameters can be extracted. For example, Zaproudina et al. related such spatial parameters to migraine [49]. Zaunseder et al. recently proposed an algorithm to assess the spatial spread from phase maps [168]. Wieringa et al. [169] used spatial measures to show various cardiac signal effects, and Verkruysse et al. [61] exploited the spatial information to reveal differences in skin areas treated by laser therapy. Moço et al. [77] used the respective maps to locally visualize the behavior of ballistocardiographic effects. Finally, Frassineti et al. [170] analyzed the fractal behavior of pulsation phase maps. In general, spatio-temporal maps are built and presented, but their benefit is not always clear.
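
One possible, lock-in style realization of such amplitude and phase maps, sketched here under the assumption that the pulse frequency is already known (e.g. from a global HR estimate), and not claiming to reproduce the exact algorithm of [62]: each pixel trace is correlated with a complex reference at the pulse frequency, yielding a complex coefficient per pixel whose magnitude and angle form the maps.

```python
import numpy as np

def pulsation_maps(video, fs, f_pulse):
    """Per-pixel amplitude and phase of the pulsation at f_pulse (Hz).

    video : (n_frames, h, w) array of pixel intensities.
    Computes the complex Fourier coefficient at the pulse frequency
    (lock-in style) as the basis for amplitude and phase maps.
    """
    n = video.shape[0]
    t = np.arange(n) / fs
    ref = np.exp(-2j * np.pi * f_pulse * t)          # complex reference
    x = video - video.mean(axis=0)                   # remove per-pixel DC
    coef = np.tensordot(ref, x, axes=(0, 0)) / n     # (h, w) complex map
    return np.abs(coef), np.angle(coef)
```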

Other measures and applications

iPPG has relevance even beyond the aforementioned measures. For example, Wang et al. [171], Nowara et al. [172] and Lakshminarayana et al. [173] proposed iPPG as a simple means to identify living skin. Other researchers used iPPG to monitor and characterize wound healing and burns [66], [174]. Lastly, a couple of works are directed at the BCG, i.e. they do not exploit color information but focus on the motion due to physiological activity to derive the BCG [31], [175], [176]. Though the BCG is primarily used for HR extraction, it offers various possibilities beyond that [177]. Moreover, extracting PPG and BCG by cameras allows combining both signals, which can yield additional information, either by making HR measurements, which both techniques can provide, more robust, or by using joint information like the PTT.


As shown, iPPG contains valuable information beyond HR and thus covers a wide range of possible applications. However, as exemplified by the differing findings concerning remote PTT measurements, the technique suffers from unknown influencing factors leading to controversial findings. A good example is presented by [161] and [160]: both works recorded the neck. The former shows that the isolated extraction of the venous pulse is possible, whereas the latter solely extracts the arterial component. Simply averaging larger neck areas would blur the resulting signal and could hinder reliable statements.

As with HR extraction, the number of included subjects is currently low, and patients must be included in order to establish a more profound basis, further develop the methods and prove feasibility. A practically critical point is the complexity of some setups: a major advantage of iPPG is its simplicity. If, for example, multiple measurement locations are required or specialized cameras are needed, the technique loses attractiveness compared to contact-based methods.


Current state: Table 2 summarizes the research activities on iPPG. The most widespread applications are the extraction of HR and HRV. To this end, the available methods and obtained results have developed considerably over the last 10 years. To our understanding, the major improvements lie in elaborate ROI definitions and combinations of color channels. An objective assessment of the available methods, their real-world applicability and the obtained results, however, is hardly possible. Despite tremendous progress, it must be assumed that the currently available methods do not meet the needs of real-world applications [6]. Larger, publicly available data sets, particularly recorded under real-world conditions and including pathological cases, are urgently needed to verify this statement and overcome the current limitations.

Table 2:

Works on iPPG aimed at cardiovascular parameters grouped according to their main focus.

A remarkable fact is that the available knowledge is in many cases not considered properly. For example, the transformation of the color space was shown to be advantageous, and spatial BSS proved advantageous compared to multispectral approaches. However, this color transformation is not standard, the BSS setups vary, and the combination of both findings, i.e. color transformation together with spatial BSS, has, to the best of our knowledge, not been applied. Similarly, the findings related to the origin of the signals have not been fully considered so far, e.g. separating blood volume-related and ballistocardiographic signals can be assumed to improve the signal quality but is rarely done. One reason is the high dynamics of the field; we hope this review contributes to a better integration of the available research.

Future perspectives: iPPG features many interesting applications, e.g. systems for stress recognition [93], [219], [221], monitoring during magnetic resonance imaging [222], monitoring anesthesia [178], [223], neonatal monitoring [45], [46] and driver monitoring [36]. Such exemplary applications underline the importance of HRV processing because the (mean) HR alone does not provide enough information for a meaningful monitoring in any of them. The increasing number of works dedicated to beat-by-beat extraction and HRV processing reflects this importance. A demanding task in any of these applications is to prove an added (clinical) value of iPPG. Naturally, early works on iPPG addressed methodological issues and restricted themselves to showing feasibility. Larger (clinical) in situ studies are the next step to confirm the value of the technique and to establish commercial systems and applications. Such systems will probably have to integrate the aforementioned functions and parameters in different ways. Moreover, adding respiratory parameters from camera recordings will provide additional benefit and pave the way toward contactless monitoring.


This work was funded by the “Bundesministerium für Bildung und Forschung” (BMBF) Funder Id: 10.13039/501100002347 (project “fast care – Kamerabasiertes Monitoring”, ref. 03ZZ0519C).


  • [1]

    Kranjec J, Beguš S, Geršak G, Drnovšek J. Non-contact heart rate and heart rate variability measurements: a review. Biomed Signal Process Control 2014;13:102–12. Google Scholar

  • [2]

    Zaunseder S, Henning A, Wedekind D, Trumpp A, Malberg H. Unobtrusive acquisition of cardiorespiratory signals. Somnologie 2017;21:93–100. Google Scholar

  • [3]

    Liu H, Wang Y, Wang L. A review of non-contact, low-cost physiological information measurement based on photoplethysmographic imaging. Conf Proc IEEE Eng Med Biol Soc 2012;2012:2088–91. Google Scholar

  • [4]

    McDuff DJ, Estepp JR, Piasecki AM, Blackford EB. A survey of remote optical photoplethysmographic imaging methods. In: 2015 37th Annu. Int. Conf. IEEE Eng. Med. Biol. Soc., (Milan, Italy), pp. 6398–404, IEEE, Aug 2015. Google Scholar

  • [5]

    Rouast PV, Adam MTP, Chiong R, Lux E. Remote heart rate measurement using low-cost RGB face video: a technical literature review. Front Comput Sci 2016;1:1–15. Google Scholar

  • [6]

    Sikdar A, Behera SK, Dogra DP. Computer-vision-guided human pulse rate estimation: a review. IEEE Rev Biomed Eng 2016;9:91–105. Google Scholar

  • [7]

    Sun Y, Thakor N. Photoplethysmography revisited: from contact to noncontact, from point to imaging. IEEE Trans Biomed Eng 2016;63:463–77. Google Scholar

  • [8]

    Bousefsaf F, Maaoui C, Pruski A. Continuous wavelet filtering on webcam photoplethysmographic signals to remotely assess the instantaneous heart rate. Biomed Signal Process Control 2013;8:568–74. Google Scholar

  • [9]

    Shao D, Yang Y, Liu C, Tsow F, Yu H, Tao N. Noncontact monitoring breathing pattern, exhalation flow rate and pulse transit time. IEEE Trans Biomed Eng 2014;61:2760–7. Google Scholar

  • [10]

    McDuff D, Gontarek S, Picard RW. Improvements in remote cardiopulmonary measurement using a five band digital camera. IEEE Trans Biomed Eng 2014;61:2593–601. Google Scholar

  • [11]

    Tarassenko L, Villarroel M, Guazzi A, Jorge J, Clifton DA, Pugh C. Non-contact video-based vital sign monitoring using ambient light and auto-regressive models. Physiol Meas 2014;35:807–31. Google Scholar

  • [12]

    Janssen R, Wang W, Moço A, de Haan G. Video-based respiration monitoring with automatic region of interest detection. Physiol Meas 2016;37:100–14. Google Scholar

  • [13]

    van Gastel M, Stuijk S, de Haan G. Robust respiration detection from remote photoplethysmography. Biomed Opt Express 2016;7:4941–57. Google Scholar

  • [14]

    Wei B, He X, Zhang C, Wu X. Non-contact, synchronous dynamic measurement of respiratory rate and heart rate based on dual sensitive regions. Biomed Eng Online 2017;16:1–21. Google Scholar

  • [15]

    Allen J. Photoplethysmography and its application in clinical physiological measurement. Physiol Meas 2007;28:R1–39. Google Scholar

  • [16]

    Reisner A, Shaltis PA, McCombie D, Asada HH. Utility of the photoplethysmogram in circulatory monitoring. Anesthesiology 2008;108:950–8. Google Scholar

  • [17]

    Hülsbusch M. Ein bildgestütztes, funktionelles Verfahren zur optoelektronischen Erfassung der Hautperfusion. PhD thesis, RWTH Aachen, 2008. Google Scholar

  • [18]

    Verkruysse W, Bartula M, Bresch E, Rocque M, Meftah M, Kirenko I. Calibration of contactless pulse oximetry. Anesth Analg 2017;124:136–45. Google Scholar

  • [19]

    Moço AV, Stuijk S, de Haan G. Skin inhomogeneity as a source of error in remote PPG-imaging. Biomed Opt Express 2016;7:4718–33. Google Scholar

  • [20]

    Kamshilin AA, Nippolainen E, Sidorov IS, Vasilev PV, Erofeev NP, Podolian NP, et al. A new look at the essence of the imaging photoplethysmography. Sci Rep 2015;5:10494. Google Scholar

  • [21]

    Sidorov IS, Romashko RV, Koval VT, Giniatullin R, Kamshilin AA. Origin of infrared light modulation in reflectance-mode photoplethysmography. PLoS One 2016;11:1–11. Google Scholar

  • [22]

    Moco AV, Stuijk S, de Haan G. Ballistocardiographic artifacts in PPG imaging. IEEE Trans Biomed Eng 2016;63:1804–11. Google Scholar

  • [23]

    Butler MJ, Crowe JA, Hayes-Gill BR, Rodmell PI. Motion limitations of non-contact photoplethysmography due to the optical and topological properties of skin. Physiol Meas 2016;37:N27–37. Google Scholar

  • [24]

    Marcinkevics Z, Rubins U, Zaharans J, Miscuks A, Urtane E, Ozolina-Moll L. Imaging photoplethysmography for clinical assessment of cutaneous microcirculation at two different depths. J Biomed Opt 2016;21:35005. Google Scholar

  • [25]

    Bashkatov AN, Genina EA, Kochubey VI, Tuchin VV. Optical properties of human skin, subcutaneous and mucous tissues in the wavelength range from 400 to 2000 nm. J Phys D Appl Phys 2005;38:2543–55. Google Scholar

  • [26]

    Kolarsick PAJ, Kolarsick MA, Goodwin C. Anatomy and physiology of the skin. J Dermatol Nurses Assoc 2011;3:203–13. Google Scholar

  • [27]

    Hu S, Azorin-Peris V, Zheng J. Opto-physiological modeling applied to photoplethysmographic cardiovascular assessment. J Healthc Eng 2013;4:505–28. Google Scholar

  • [28]

    Trumpp A, Bauer PL, Rasche S, Malberg H, Zaunseder S. The value of polarization in camera-based photoplethysmography. Biomed Opt Express 2017;8:2822–34. Google Scholar

  • [29]

    Hassan MA, Malik AS, Fofi D, Saad N, Karasfi B, Ali YS, et al. Heart rate estimation using facial video: a review. Biomed. Signal Process. Control 2017;38:346–60. Google Scholar

  • [30]

    Shao D, Liu C, Tsow F, Yang Y, Du Z, Iriya R, et al. Noncontact monitoring of blood oxygen saturation using camera and dual-wavelength imaging system. IEEE Trans Biomed Eng 2016;63:1091–8. Google Scholar

  • [31]

    Shao D, Tsow F, Liu C, Yang Y, Tao N. Simultaneous monitoring of ballistocardiogram and photoplethysmogram using a camera. IEEE Trans Biomed Eng 2017;64:1003–10. Google Scholar

  • [32]

    Bousefsaf F, Maaoui C, Pruski A. Automatic selection of webcam photoplethysmographic pixels based on lightness criteria. J Med Biol Eng 2017;37:374–85. Google Scholar

  • [33]

    McDuff D, Blackford E, Estepp J. Fusing partial camera signals for non-contact pulse rate variability measurement. IEEE Trans Biomed Eng 2017:1–1. Google Scholar

  • [34]

    Po L-M, Feng L, Li Y, Xu X, Cheung TC-H, Cheung K-W. Block-based adaptive ROI for remote photoplethysmography. Multimed Tools Appl 2018;77:6503–29. Google Scholar

  • [35]

    Stricker R, Muller S, Gross H-M. Non-contact video-based pulse rate measurement on a mobile service robot. In: 23rd IEEE Int. Symp. Robot Hum. Interact. Commun., Edinburgh, UK, pp. 1056–62, IEEE, Aug 2014. Google Scholar

  • [36]

    Kuo J, Koppel S, Charlton JL, Rudin-Brown CM. Evaluation of a video-based measure of driver heart rate. J Safety Res 2015;54:55–9. Google Scholar

  • [37]

    Qi H, Wang ZJ, Miao C. Non-contact driver cardiac physiological monitoring using video data. In: 2015 IEEE China Summit Int Conf Signal Inf Process., vol. 3, Chengdu, China, pp. 418–22, IEEE, Jul 2015. Google Scholar

  • [38]

    Sun Y, Hu S, Azorin-Peris V, Greenwald S, Chambers J, Zhu Y. Motion-compensated noncontact imaging photoplethysmography to monitor cardiorespiratory status during exercise. J Biomed Opt 2011;16:77010. Google Scholar

  • [39]

    de Haan G, van Leest A. Improved motion robustness of remote-PPG by using the blood volume pulse signature. Physiol Meas 2014;35:1913–26. Google Scholar

  • [40]

    Rasche S, Trumpp A, Waldow T, Gaetjen F, Plötze K, Wedekind D, et al. Camera-based photoplethysmography in critical care patients. Clin Hemorheol Microcirc 2016;64:77–90. Google Scholar

  • [41]

    Couderc J-P, Kyal S, Mestha LK, Xu B, Peterson DR, Xia X, et al. Detection of atrial fibrillation using contactless facial video monitoring. Hear Rhythm 2015;12:195–201. Google Scholar

  • [42]

    Amelard R, Clausi DA, Wong A. Spectral-spatial fusion model for robust blood pulse waveform extraction in photoplethysmographic imaging. Biomed Opt Express 2016;7:4874. Google Scholar

  • [43]

    Aarts LA, Jeanne V, Cleary JP, Lieber C, Nelson JS, Bambang Oetomo S, et al. Non-contact heart rate monitoring utilizing camera photoplethysmography in the neonatal intensive care unit – a pilot study. Early Hum Dev 2013;89:943–8. Google Scholar

  • [44]

    Blanik N, Heimann K, Pereira C, Paul M, Blazek V, Venema B, et al. Remote vital parameter monitoring in neonatology – robust, unobtrusive heart rate detection in a realistic clinical scenario. Biomed Tech (Berl) 2016;61:631–43. Google Scholar

  • [45]

    Scalise L, Bernacchia N, Ercoli I, Marchionni P. Heart rate measurement in neonatal patients using a webcamera. In: 2012 IEEE Int. Symp. Med. Meas. Appl. Proc., pp. 1–4, IEEE, May 2012. Google Scholar

  • [46]

    Villarroel M, Guazzi A, Jorge J, Davis S, Watkinson P, Green G, et al. Continuous non-contact vital sign monitoring in neonatal intensive care unit. Healthc Technol Lett 2014;1: 87–91. Google Scholar

  • [47]

    Trumpp A, Lohr J, Wedekind D, Schmidt M, Burghardt M, Heller AR, et al. Camera-based photoplethysmography in an intraoperative setting. Biomed Eng Online 2018;17:33. Google Scholar

  • [48]

    Villarroel M, Jorge J, Pugh C, Tarassenko L. Non-contact vital sign monitoring in the clinic. In: 2017 12th IEEE Int. Conf. Autom. Face Gesture Recognit. (FG 2017), Washington, DC, USA, pp. 278–85, 2017. Google Scholar

  • [49]

    Zaproudina N, Teplov V, Nippolainen E, Lipponen JA, Kamshilin AA, Närhi M, et al. Asynchronicity of facial blood perfusion in migraine. PLoS One 2013;8:e80189. Google Scholar

  • [50]

    Trumpp A, Rasche S, Wedekind D, Schmidt M, Waldow T, Gaetjen F, et al. Skin detection and tracking for camera-based photoplethysmography using a Bayesian classifier and level set segmentation. In: Bild für die Medizin 2017;2017:43–8. Google Scholar

  • [51]

    Addison PS, Jacquel D, Foo DM, Borg UR. Video-based heart rate monitoring across a range of skin pigmentations during an acute hypoxic challenge. J Clin Monit Comput 2017; 0:1–10. Google Scholar

  • [52]

    Choe J, Chung D, Schwichtenberg AJ, Delp EJ. Improving video-based resting heart rate estimation: a comparison of two methods. In: 2015 IEEE 58th Int. Midwest Symp. Circuits Syst., vol. 58, (Fort Collins, USA), pp. 1–4, IEEE, Aug 2015. Google Scholar

  • [53]

    de Haan G, Jeanne V. Robust pulse rate from chrominance-based rPPG. IEEE Trans Biomed Eng 2013;60:2878–86. Google Scholar

  • [54]

    Wang W, den Brinker AC, Stuijk S, de Haan G. Algorithmic principles of remote PPG. IEEE Trans Biomed Eng 2017;64: 1479–91. Google Scholar

  • [55]

    Pilz CS, Zaunseder S, Canzler U, Krajewski J. Heart rate from face videos under realistic conditions for advanced driver monitoring. Curr Dir Biomed Eng 2017;3:483–7. Google Scholar

  • [56]

    Kumar M, Veeraraghavan A, Sabharwal A. DistancePPG: robust non-contact vital signs monitoring using a camera. Biomed Opt Express 2015;6:1565. Google Scholar

  • [57]

    McDuff DJ, Blackford EB, Estepp JR. The impact of video compression on remote cardiac pulse measurement using imaging photoplethysmography. In: 2017 12th IEEE Int. Conf. Autom. Face Gesture Recognit. (FG 2017), Washington, DC, USA, pp. 63–70, IEEE, May 2017. Google Scholar

  • [58]

    Soleymani M, Lichtenauer J, Pun T, Pantic M. A multimodal database for affect recognition and implicit tagging. IEEE Trans Affect Comput 2012;3:42–55. Google Scholar

  • [59]

    Bobbia S, Macwan R, Benezeth Y, Mansouri A, Dubois J. Unsupervised skin tissue segmentation for remote photoplethysmography. Pattern Recognit Lett 2017;0:1–9. Google Scholar

  • [60]

    Koelstra S, Mühl C, Soleymani M, Lee JS, Yazdani A, Ebrahimi T, et al. DEAP: a database for emotion analysis; Using physiological signals. IEEE Trans Affect Comput 2012;3:18–31. Google Scholar

  • [61]

    Verkruysse W, Svaasand LO, Nelson JS. Remote plethysmographic imaging using ambient light. Opt Express 2008;16:21434–45. Google Scholar

  • [62]

    Kamshilin AA, Miridonov S, Teplov V, Saarenheimo R, Nippolainen E. Photoplethysmographic imaging of high spatial resolution. Biomed Opt Express 2011;2:996–1006. Google Scholar

  • [63]

    Sidorov IS, Volynsky MA, Kamshilin AA. Influence of polarization filtration on the information readout from pulsating blood vessels. Biomed Opt Express 2016;7:2469. Google Scholar

  • [64]

    Jeanne V, Asselman M, den Brinker B, Bulut M. Camera-based heart rate monitoring in highly dynamic light conditions. In: 2013 Int. Conf. Connect. Veh. Expo, pp. 798–9, IEEE, Dec 2013. Google Scholar

  • [65]

    Wu T, Blazek V, Schmitt HJ. Photoplethysmography imaging: a new noninvasive and noncontact method for mapping of the dermal perfusion changes. In: Priezzhev AV, Oberg PA, editors. Proc. SPIE 4163, Amsterdam, Netherlands, SPIE; 2000:62. Google Scholar

  • [66]

    Huelsbusch M, Blazek V. Contactless mapping of rhythmical phenomena in tissue perfusion using PPGI. In: Clough AV, Chen C-T, editors. Proc. SPIE. 2002;4683:110. Google Scholar

  • [67]

    van Gastel M, Stuijk S, de Haan G. Motion robust remote-PPG in infrared. IEEE Trans Biomed Eng 2015;62:1425–33. Google Scholar

  • [68]

    Lempe G, Zaunseder S, Wirthgen T, Zipser S, Malberg H. ROI selection for remote photoplethysmography. In: Meinzer H-P, Deserno TM, Handels H, Tolxdorff T, editors. Bild. für die Medizin 2013. Heidelberg: Springer; 2013:99–103.

  • [69]

    Kong L, Zhao Y, Dong L, Jian Y, Jin X, Li B, et al. Non-contact detection of oxygen saturation based on visible light imaging device using ambient light. Opt Express 2013;21:17464.

  • [70]

    Amelard R, Scharfenberger C, Wong A, Clausi DA. Illumination-compensated non-contact imaging photoplethysmography via dual-mode temporally coded illumination. In: SPIE BiOS 2015;9316:931607.

  • [71]

    Estepp JR, Blackford EB, Meier CM. Recovering pulse rate during motion artifact with a multi-imager array for non-contact imaging photoplethysmography. In: 2014 IEEE Int. Conf. Syst. Man, Cybern., San Diego, CA, USA, pp. 1462–9, IEEE, Oct 2014.

  • [72]

    Sun Y, Hu S, Azorin-Peris V, Kalawsky R, Greenwald S. Noncontact imaging photoplethysmography to effectively access pulse rate variability. J Biomed Opt 2013;18:061205.

  • [73]

    Trumpp A, Schell J, Malberg H, Zaunseder S. Vasomotor assessment by camera-based photoplethysmography. Curr Dir Biomed Eng 2016;2:199–202.

  • [74]

    Jeong IC, Finkelstein J. Introducing contactless blood pressure assessment using a high speed video camera. J Med Syst 2016;40:77.

  • [75]

    Blackford EB, Estepp JR. Using consumer-grade devices for multi-imager non-contact imaging photoplethysmography. In: Coté GL, editor. Opt. Diagnostics Sens. XVII Towar. Point-of-Care Diagnostics. San Francisco, California, United States: SPIE 2017;10072:100720P.

  • [76]

    Kamshilin AA, Teplov V, Nippolainen E, Miridonov S, Giniatullin R. Variability of microcirculation detected by blood pulsation imaging. PLoS One 2013;8:e57117.

  • [77]

    Moço AV, Stuijk S, de Haan G. Motion robust PPG-imaging through color channel mapping. Biomed Opt Express 2016;7:1737.

  • [78]

    Guazzi AR, Villarroel M, Jorge J, Daly J, Frise MC, Robbins PA, et al. Non-contact measurement of oxygen saturation with an RGB camera. Biomed Opt Express 2015;6:3320.

  • [79]

    Koprowski R. Blood pulsation measurement using cameras operating in visible light: limitations. Biomed Eng Online 2016;15:111.

  • [80]

    Liu H, Wang Y, Wang L. The effect of light conditions on photoplethysmographic image acquisition using a commercial camera. IEEE J Transl Eng Health Med 2014;2:1–11.

  • [81]

    Viola P, Jones M. Rapid object detection using a boosted cascade of simple features. In: Proc. 2001 IEEE Comput. Soc. Conf. Comput. Vis. Pattern Recognition. CVPR 2001, vol. 1, Kauai, HI, USA, pp. I–511–I–518, IEEE Comput. Soc, 2001.

  • [82]

    Kwon S, Kim J, Lee D, Park K. ROI analysis for remote photoplethysmography on facial video. In: 2015 37th Annu. Int. Conf. IEEE Eng. Med. Biol. Soc., vol. 2015, Milan, Italy, pp. 4938–41, IEEE, Aug 2015.

  • [83]

    Antink CH, Gao H, Brüser C, Leonhardt S. Beat-to-beat heart rate estimation fusing multimodal video and sensor data. Biomed Opt Express 2015;6:2895.

  • [84]

    Iozzia L, Cerina L, Mainardi L. Relationships between heart-rate variability and pulse-rate variability obtained from video-PPG signal using ZCA. Physiol Meas 2016;37:1934–44.

  • [85]

    Lewandowska M, Ruminski J, Kocejko T, Nowak J. Measuring pulse rate with a webcam – a non-contact method for evaluating cardiac activity. In: Ganzha M, Maciaszek LA, Paprzycki M, editors. FedCSIS, Szczecin, Poland; 2011:405–10.

  • [86]

    Yu Y-P, Raveendran P, Lim C-L. Dynamic heart rate measurements from video sequences. Biomed Opt Express 2015;6:2466–80.

  • [87]

    Holton BD, Mannapperuma K, Lesniewski PJ, Thomas JC. Signal recovery in imaging photoplethysmography. Physiol Meas 2013;34:1499–511.

  • [88]

    Hsu Y, Lin Y-L, Hsu W. Learning-based heart rate detection from remote photoplethysmography features. In: 2014 IEEE Int. Conf. Acoust. Speech Signal Process., Florence, Italy, pp. 4433–7, IEEE, May 2014.

  • [89]

    Poh M-Z, McDuff DJ, Picard RW. Non-contact, automated cardiac pulse measurements using video imaging and blind source separation. Opt Express 2010;18:10762–74.

  • [90]

    Poh M-Z, McDuff DJ, Picard RW. Advancements in noncontact, multiparameter physiological measurements using a webcam. IEEE Trans Biomed Eng 2011;58:7–11.

  • [91]

    Asthana A, Zafeiriou S, Cheng S, Pantic M. Incremental face alignment in the wild. In: 2014 IEEE Conf. Comput. Vis. Pattern Recognit., Columbus, OH, USA, pp. 1859–66, IEEE, Jun 2014.

  • [92]

    Saragih JM, Lucey S, Cohn JF. Deformable model fitting by regularized landmark mean-shift. Int J Comput Vis 2011;91:200–15.

  • [93]

    McDuff D, Gontarek S, Picard RW. Remote detection of photoplethysmographic systolic and diastolic peaks using a digital camera. IEEE Trans Biomed Eng 2014;61:2948–54.

  • [94]

    Martinez B, Valstar MF, Binefa X, Pantic M. Local evidence aggregation for regression-based facial point detection. IEEE Trans Pattern Anal Mach Intell 2013;35:1149–63.

  • [95]

    Trumpp A, Malberg H, Zaunseder S. Signal extraction in camera-based photoplethysmography using a modified Wiener filter. In: 5. Dresdner Medizintechnik Symp., Dresden, Germany, pp. 105–7, 2014.

  • [96]

    Bal U. Non-contact estimation of heart rate and oxygen saturation using ambient light. Biomed Opt Express 2015;6:86–97.

  • [97]

    Huang R-Y, Dung L-R. Measurement of heart rate variability using off-the-shelf smart phones. Biomed Eng Online 2016;15:11.

  • [98]

    Bobbia S, Benezeth Y, Dubois J. Remote photoplethysmography based on implicit living skin tissue segmentation. In: 2016 23rd Int. Conf. Pattern Recognit., Cancun, Mexico, pp. 361–365, IEEE, Dec 2016.

  • [99]

    Moreno J, Ramos-Castro J, Movellan J, Parrado E, Rodas G, Capdevila L. Facial video-based photoplethysmography to detect HRV at rest. Int J Sports Med 2015;36:474–80.

  • [100]

    Wang W, Stuijk S, de Haan G. A novel algorithm for remote photoplethysmography: spatial subspace rotation. IEEE Trans Biomed Eng 2016;63:1974–84.

  • [101]

    Murakami K, Yoshioka M, Ozawa J. Non-contact pulse transit time measurement using imaging camera, and its relation to blood pressure. In: 2015 14th IAPR Int. Conf. Mach. Vis. Appl., Tokyo, Japan, pp. 414–7, IEEE, May 2015.

  • [102]

    Kviesis-Kipge E, Rubins U. Portable remote photoplethysmography device for monitoring of blood volume changes with high temporal resolution. In: 2016 15th Bienn. Balt. Electron. Conf., Tallinn, Estonia, pp. 55–8, IEEE, Oct 2016.

  • [103]

    Lee K-Z, Hung P-C, Tsai L-W. Contact-free heart rate measurement using a camera. In: 2012 Ninth Conf. Comput. Robot Vis., Toronto, ON, Canada, pp. 147–52, IEEE, May 2012.

  • [104]

    Feng L, Po L-M, Xu X, Li Y, Cheung C-H, Cheung K-W, et al. Dynamic ROI based on K-means for remote photoplethysmography. In: 2015 IEEE Int. Conf. Acoust. Speech Signal Process., Brisbane, QLD, Australia, pp. 1310–4, IEEE, Apr 2015.

  • [105]

    Humphreys K, Ward T, Markham C. Noncontact simultaneous dual wavelength photoplethysmography: a further step toward noncontact pulse oximetry. Rev Sci Instrum 2007;78:44304.

  • [106]

    Blanik N, Abbas AK, Venema B, Blazek V, Leonhardt S. Hybrid optical imaging technology for long-term remote monitoring of skin perfusion and temperature behavior. J Biomed Opt 2014;19:16012.

  • [107]

    Lucas BD, Kanade T. An iterative image registration technique with an application to stereo vision. In: Proc. Imaging Underst. Work., pp. 121–30, 1981.

  • [108]

    Tomasi C, Kanade T. Detection and tracking of point features. Tech. Rep. CMU-CS-91-132, Carnegie Mellon University, 1991.

  • [109]

    Gambi E, Agostinelli A, Belli A, Burattini L, Cippitelli E, Fioretti S, et al. Heart rate detection using microsoft kinect: validation and comparison to wearable devices. Sensors 2017;17:1776.

  • [110]

    Tarbox EA, Rios C, Kaur B, Meyer S, Hirt L, Tran V, et al. Motion correction for improved estimation of heart rate using a visual spectrum camera. In: Cullum BM, Kiehl D, McLamore ES, editors. Proc. SPIE 10216, Smart Biomed. Physiol. Sens. Technol., Anaheim, California, United States; 2017;10216:1021607.

  • [111]

    Henriques JF, Caseiro R, Martins P, Batista J. Exploiting the circulant structure of tracking-by-detection with kernels. In: Comput. Vis. – ECCV 2012, Florence, Italy, pp. 702–715, Berlin, Heidelberg: Springer, 2012.

  • [112]

    Farnebäck G. Two-frame motion estimation based on polynomial expansion. In: Image Anal. Tromso, Norway: Springer Berlin Heidelberg; 2003:367–70.

  • [113]

    Bousefsaf F, Maaoui C, Pruski A. Peripheral vasomotor activity assessment using a continuous wavelet analysis on webcam photoplethysmographic signals. Biomed Mater Eng 2016;27:527–38.

  • [114]

    Yang Y, Liu C, Yu H, Shao D, Tsow F, Tao N. Motion robust remote photoplethysmography in CIELab color space. J Biomed Opt 2016;21:117001.

  • [115]

    Rumiński J. Reliability of pulse measurements in videoplethysmography. Metrol Meas Syst 2016;23:359–71.

  • [116]

    Lueangwattana C, Kondo T, Haneishi H. A comparative study of video signals for non-contact heart rate measurement. In: 12th Int. Conf. Electr. Eng. Comput. Telecommun. Inf. Technol., Hua Hin, Thailand, pp. 1–5, 2015.

  • [117]

    Tsouri GR, Li Z. On the benefits of alternative color spaces for noncontact heart rate measurements using standard red-green-blue cameras. J Biomed Opt 2015;20:048002.

  • [118]

    Xu S, Sun L, Rohde GK. Robust efficient estimation of heart rate pulse from video. Biomed Opt Express 2014;5:1124–35.

  • [119]

    Wang W, Stuijk S, de Haan G. A novel algorithm for remote photoplethysmography: spatial subspace rotation. IEEE Trans Biomed Eng 2016;63:1974–84.

  • [120]

    Wedekind D, Malberg H, Zaunseder S, Gaetjen F, Matschke K, Rasche S. Automated identification of cardiac signals after blind source separation for camera-based photoplethysmography. In: 2015 IEEE 35th Int. Conf. Electron. Nanotechnol., Kiev, Ukraine, pp. 422–7, IEEE, Apr 2015.

  • [121]

    Comon P. Independent component analysis, a new concept? Signal Process 1994;36:287–314.

  • [122]

    Tsouri GR, Kyal S, Dianat S, Mestha LK. Constrained independent component analysis approach to nonobtrusive pulse rate measurements. J Biomed Opt 2012;17:077011.

  • [123]

    Christinaki E, Giannakakis G, Chiarugi F, Pediaditis M, Iatraki G, Manousos D, et al. Comparison of blind source separation algorithms for optical heart rate monitoring. In: Proc. 4th Int. Conf. Wirel. Mob. Commun. Healthc. – “Transforming Healthc. through Innov. Mob. Wirel. Technol.”, Athens, Greece, pp. 339–42, ICST, 2014.

  • [124]

    Kwon S, Kim H, Park KS. Validation of heart rate extraction using video imaging on a built-in camera system of a smartphone. In: Eng. Med. Biol. Soc. (EMBC), 2012 Annu. Int. Conf. IEEE, San Diego, California, USA, pp. 2174–7, 2012.

  • [125]

    Yang L, Liu M, Dong L, Zhao Y, Liu X. Motion-compensated non-contact detection of heart rate. Opt Commun 2015;357:161–8.

  • [126]

    Feng L, Po L-M, Xu X, Li Y. Motion artifacts suppression for remote imaging photoplethysmography. In: 2014 19th Int. Conf. Digit. Signal Process., pp. 18–23, IEEE, Aug 2014.

  • [127]

    Wedekind D, Trumpp A, Gaetjen F, Rasche S, Matschke K, Malberg H, et al. Assessment of blind source separation techniques for video-based cardiac pulse extraction. J Biomed Opt 2017;22:035002.

  • [128]

    Lam A, Kuno Y. Robust heart rate measurement from video using select random patches. Proc IEEE Int Conf Comput Vis 2015:3640–8.

  • [129]

    Cheng J, Chen X, Xu L, Wang ZJ. Illumination variation-resistant video-based heart rate measurement using joint blind source separation and ensemble empirical mode decomposition. IEEE J Biomed Health Inform 2017;21:1422–33.

  • [130]

    Lee JS, Lin KW, Syue JL. Smartphone-based heart-rate measurement using facial images and a spatiotemporal alpha-trimmed mean filter. Technol Health Care 2016;24:S777–83.

  • [131]

    Wang W, den Brinker AC, Stuijk S, de Haan G. Amplitude-selective filtering for remote-PPG. Biomed Opt Express 2017;8:1965.

  • [132]

    Wu B-F, Huang P-W, Tsou T-Y, Lin T-M, Chung M-L. Camera-based heart rate measurement using continuous wavelet transform. In: Int. Conf. on Syst. Sci. Eng., pp. 7–11, 2017.

  • [133]

    Sun X, Yang P, Li Y, Gao Z, Yuan-Ting Z. Robust heart beat detection from photoplethysmography interlaced with motion artifacts based on Empirical Mode Decomposition. In: Proc. 2012 IEEE-EMBS Int. Conf. Biomed. Heal. Informatics, pp. 775–8, IEEE, Jan 2012.

  • [134]

    Jiang WJ, Gao SC, Wittek P, Zhao L. Real-time quantifying heart beat rate from facial video recording on a smart phone using Kalman filters. In: 2014 IEEE 16th Int. Conf. e-Health Networking, Appl. Serv., Natal, Brazil, pp. 393–6, IEEE, Oct 2014.

  • [135]

    Allen J, Murray A. Effects of filtering on multi-site photoplethysmography pulse waveform characteristics. In: Comput. Cardiol., Chicago, IL, USA, pp. 485–8, IEEE, 2004.

  • [136]

    Takano C, Ohta Y. Heart rate measurement based on a time-lapse image. Med Eng Phys 2007;29:853–7.

  • [137]

    Alghoul K, Alharthi S, Al Osman H, El Saddik A. Heart rate variability extraction from video signals: ICA vs. EVM comparison. IEEE Access 2017;5:4711–9.

  • [138]

    Capdevila L, Moreno J, Movellan J, Parrado E, Ramos-Castro J. HRV based health&sport markers using video from the face. In: 2012 Annu. Int. Conf. IEEE Eng. Med. Biol. Soc., pp. 5646–9, 2012.

  • [139]

    Lin Y-C, Chou N-K, Lin G-Y, Li M-H, Lin Y-H. A real-time contactless pulse rate and motion status monitoring system based on complexion tracking. Sensors 2017;17:1490.

  • [140]

    Zaunseder S, Heinke A, Trumpp A, Malberg H. Heart beat detection and analysis from videos. In: 2014 IEEE 34th Int. Sci. Conf. Electron. Nanotechnol., pp. 286–90, IEEE, Apr 2014.

  • [141]

    Osman A, Turcot J, El Kaliouby R. Supervised learning approach to remote heart rate estimation from facial videos. In: 11th IEEE Int. Conf. Work. Autom. Face Gesture Recognit., vol. 11, Ljubljana, Slovenia, pp. 1–6, 2015.

  • [142]

    Andreotti F, Trumpp A, Malberg H, Zaunseder S. Improved heart rate detection for camera-based photoplethysmography by means of Kalman filtering. In: 2015 IEEE 35th Int. Conf. Electron. Nanotechnology, ELNANO 2015 – Conf. Proc., Kiev, Ukraine, pp. 428–33, IEEE, Apr 2015.

  • [143]

    Kyal S, Mestha LK, Xu B, Couderc J-P. A method to detect cardiac arrhythmias with a webcam. In: 2013 IEEE Signal Process. Med. Biol. Symp., Brooklyn, NY, USA, pp. 1–5, IEEE, Dec 2013.

  • [144]

    Monkaresi H, Calvo RA, Yan H. A machine learning approach to improve contactless heart rate monitoring using a webcam. IEEE J Biomed Health Inform 2014;18:1153–60.

  • [145]

    Wartzek T, Bruser C, Walter M, Leonhardt S. Robust sensor fusion of unobtrusively measured heart rate. IEEE J Biomed Health Inform 2014;18:654–60.

  • [146]

    Fukunishi M, Mcduff D, Tsumura N. Improvements in remote video based estimation of heart rate variability using the Welch FFT method. Artif Life Robot 2018;23:15–22.

  • [147]

    Wu H-T, Lewis GF, Davila MI, Daubechies I, Porges SW. Optimizing estimates of instantaneous heart rate from pulse wave signals with the synchrosqueezing transform. Methods Inf Med 2016;55:463–72.

  • [148]

    Task Force of the European Society of Cardiology and the North American Society of Pacing and Electrophysiology. Heart rate variability: standards of measurement, physiological interpretation and clinical use. Circulation 1996;93:1043–65.

  • [149]

    Valenza G, Iozzia L, Cerina L, Mainardi L, Barbieri R. Assessment of instantaneous cardiovascular dynamics from video plethysmography. In: 2017 39th Annu. Int. Conf. IEEE Eng. Med. Biol. Soc., Seogwipo, South Korea, pp. 1776–9, IEEE, Jul 2017.

  • [150]

    Gil E, Orini M, Bailón R, Vergara JM, Mainardi L, Laguna P. Photoplethysmography pulse rate variability as a surrogate measurement of heart rate variability during non-stationary conditions. Physiol Meas 2010;31:1271–90.

  • [151]

    Schäfer A, Vagedes J. How accurate is pulse rate variability as an estimate of heart rate variability? A review on studies comparing photoplethysmographic technology with an electrocardiogram. Int J Cardiol 2013;166:15–29.

  • [152]

    Wieringa F, Mastik F, Boks R, Visscher A, Bogers A, Van der Steen A. In vitro demonstration of an SpO2-camera. In: 2007 Comput. Cardiol., Durham, NC, USA, pp. 749–51, IEEE, Sep 2007.

  • [153]

    Fan Q, Li K. Noncontact imaging plethysmography for accurate estimation of physiological parameters. J Med Biol Eng 2017;37:675–85.

  • [154]

    Addison PS, Jacquel D, Foo DM, Antunes A, Borg UR. Video-based physiologic monitoring during an acute hypoxic challenge: heart rate, respiratory rate, and oxygen saturation. Anesth Analg 2017;125:860–73.

  • [155]

    Mishra D, Priyadarshini N, Chakraborty S, Sarkar M. Blood oxygen saturation measurement using polarization-dependent optical sectioning. IEEE Sens J 2017;17.

  • [156]

    Nishidate I, Hoshi A, Aoki Y, Nakano K, Niizeki K, Aizu Y. Noncontact imaging of plethysmographic pulsation and spontaneous low-frequency oscillation in skin perfusion with a digital red-green-blue camera. In: Tuchin VV, Larin KV, Leahy MJ, Wang RK, editors. Proc. SPIE 9707, Dyn. Fluctuations Biomed. Photonics. San Francisco, USA; 2016;9707:97070L.

  • [157]

    Kamshilin AA, Zaytsev VV, Mamontov OV. Novel contactless approach for assessment of venous occlusion plethysmography by video recordings at the green illumination. Sci Rep 2017;7:464.

  • [158]

    Blanik N, Blazek C, Pereira C, Blazek V, Leonhardt S. Frequency-selective quantification of skin perfusion behavior during allergic testing using photoplethysmography imaging. In: SPIE Med. Imaging, vol. 9034, San Diego, California, United States, p. 903429, 2014.

  • [159]

    Nakano K, Satoh R, Hoshi A, Matsuda R, Suzuki H, Nishidate I. Non-contact imaging of venous compliance in humans using an RGB camera. Opt Rev 2015;22:335–41.

  • [160]

    Moço AV, Zavala Mondragon LA, Wang W, Stuijk S, de Haan G. Camera-based assessment of arterial stiffness and wave reflection parameters from neck micro-motion. Physiol Meas 2017;38:1576–98.

  • [161]

    Amelard R, Hughson RL, Greaves DK, Pfisterer KJ, Leung J, Clausi DA, et al. Non-contact hemodynamic imaging reveals the jugular venous pulse waveform. Sci Rep 2017;7:40150.

  • [162]

    Yang J, Guthier B, Saddik AE. Estimating two-dimensional blood flow velocities from videos. In: 2015 IEEE Int. Conf. Image Process., Quebec City, QC, Canada, pp. 3768–72, IEEE, Sep 2015.

  • [163]

    Kaur B, Moses S, Luthra M, Ikonomidou VN. Remote heartbeat signal detection from visible spectrum recordings based on blind deconvolution. In: Proc. SPIE 9871, Sens. Anal. Technol. Biomed. Cogn. Appl., vol. 9871, Baltimore, USA, pp. 987103–9, 2016.

  • [164]

    Kamshilin AA, Sidorov IS, Babayan L, Volynsky MA, Giniatullin R, Mamontov OV. Accurate measurement of the pulse wave delay with imaging photoplethysmography. Biomed Opt Express 2016;7:5138–47.

  • [165]

    Zhang G, Shan C, Kirenko I, Long X, Aarts RM. Hybrid optical unobtrusive blood pressure measurements. Sensors (Basel) 2017;17:1541.

  • [166]

    Secerbegovic A, Bergsland J, Halvorsen PS, Suljanovic N, Mujcic A, Balasingham I. Blood pressure estimation using video photoplethysmography. In: 13th Int. Symp. Biomed. Imaging, Prague, Czech Republic, pp. 461–4, 2016.

  • [167]

    Sugita N, Obara K, Yoshizawa M, Abe M, Tanaka A, Homma N. Techniques for estimating blood pressure variation using video images. In: 2015 37th Annu. Int. Conf. IEEE Eng. Med. Biol. Soc., Milan, Italy, pp. 4218–21, Aug 2015.

  • [168]

    Zaunseder S, Trumpp A, Ernst H, Förster M, Malberg H. Spatio-temporal analysis of blood perfusion by imaging photoplethysmography. In: Coté GL, editor. Opt. Diagnostics Sens. XVIII Towar. Point-of-Care Diagnostics, p. 32, SPIE, San Francisco, California, United States, Feb 2018.

  • [169]

    Wieringa FP, Mastik F, van der Steen AFW. Contactless multiple wavelength photoplethysmographic imaging: a first step toward “SpO2 camera” technology. Ann Biomed Eng 2005;33:1034–41.

  • [170]

    Frassineti L, Giardini F, Perrella A, Sorelli M, Sacconi L, Bocchi L. Evaluation of spatial distribution of skin blood flow using optical imaging. In: CMBEBIH 2017: Proc. Int. Conf. Med. Biol. Eng. 2017, pp. 74–80, 2017.

  • [171]

    Wang W, Stuijk S, de Haan G. Living-skin classification via remote-PPG. IEEE Trans Biomed Eng 2017;64:2781–92.

  • [172]

    Nowara EM, Sabharwal A, Veeraraghavan A. PPGSecure: biometric presentation attack detection using photopletysmograms. In: 2017 12th IEEE Int. Conf. Autom. Face Gesture Recognit. (FG 2017), pp. 56–62, IEEE, May 2017.

  • [173]

    Lakshminarayana NN, Narayan N, Napp N, Setlur S, Govindaraju V. A discriminative spatio-temporal mapping of face for liveness detection. In: 2017 IEEE Int. Conf. Identity, Secur. Behav. Anal., New Delhi, India, pp. 1–7, IEEE, Feb 2017.

  • [174]

    Thatcher JE, Li W, Rodriguez-Vaqueiro Y, Squiers JJ, Mo W, Lu Y, et al. Multispectral and photoplethysmography optical imaging techniques identify important tissue characteristics in an animal model of tangential burn excision. J Burn Care Res 2016;37:38–52.

  • [175]

    Balakrishnan G, Durand F, Guttag J. Detecting pulse from head motions in video. Proc IEEE Comput Soc Conf Comput Vis Pattern Recognit 2013:3430–7.

  • [176]

    Hassan MA, Malik AS, Fofi D, Saad NM, Ali YS, Meriaudeau F. Video-based heartbeat rate measuring method using ballistocardiography. IEEE Sens J 2017;17:4544–57.

  • [177]

    Vogt E, Macquarrie D, Neary JP. Using ballistocardiography to measure cardiac performance: a brief review of its history and future significance. Clin Physiol Funct Imaging 2012;32:415–20.

  • [178]

    Rubīns U, Spīgulis J, Miščuks A. Photoplethysmography imaging algorithm for continuous monitoring of regional anesthesia. In: Proc. 14th ACM/IEEE Symp. Embed. Syst. Real-Time Multimed. – ESTIMedia’16, New York, NY, USA, pp. 67–71, ACM Press, 2016.

  • [179]

    Kamshilin AA, Margaryants NB. Origin of photoplethysmographic waveform at green light. Phys Procedia 2017;86:72–80.

  • [180]

    Tsoy MO, Rogatina KV, Stiukhina ES, Postnov DE. The assessment of sympathetic activity using iPPG based inter-limb coherence measurements. In: Derbov VL, Postnov DE, editors. Fourth Int. Symp. Opt. Biophotonics, vol. 10337, Saratov, Russian Federation, p. 1033718, SPIE, Apr 2017.

  • [181]

    Volkov MV, Margaryants NB, Potemkin AV, Volynsky MA, Gurov IP, Mamontov OV, et al. Video capillaroscopy clarifies mechanism of the photoplethysmographic waveform appearance. Sci Rep 2017;7:1–8.

  • [182]

    Abbas G, Khan MJ, Qureshi R, Khurshid K. Scope of video magnification in human pulse rate estimation. In: 2017 Int. Conf. Mach. Vis. Inf. Technol., pp. 69–75, IEEE, Feb 2017.

  • [183]

    Blackford EB, Estepp JR. Measurements of pulse rate using long-range imaging photoplethysmography and sunlight illumination outdoors. In: Coté GL, editor. Proc. SPIE 10072, Opt. Diagnostics Sens. XVII Towar. Point-of-Care Diagnostics, San Francisco, California, United States 2017;10072:100720S.

  • [184]

    Das K, Ali S, Otsu K, Fukuda H, Lam A, Kobayashi Y, et al. Detecting inner emotions from video based heart rate sensing. In: Huang D-S, Han K, Hussain A, editors. Int. Conf. Intell. Comput, vol. 9773 of Lecture Notes in Computer Science. Cham: Springer International Publishing; 2017:48–57.

  • [185]

    Fallet S, Moser V, Braun F, Vesin J-M. Imaging photoplethysmography: what are the best locations on the face to estimate heart rate? Comput Cardiol 2016;43:341–4.

  • [186]

    Feng L, Po L-M, Xu X, Li Y, Ma R. Motion-resistant remote imaging photoplethysmography based on the optical properties of skin. IEEE Trans Circuits Syst Video Technol 2015;25:879–91.

  • [187]

    Fernandes SL, Gurupur VP, Sunder NR, Arunkumar N, Kadry S. A novel nonintrusive decision support approach for heart rate measurement. Pattern Recognit Lett 2017:1–9.

  • [188]

    Fletcher RR, Chamberlain D, Paggi N, Deng X. Implementation of smart phone video plethysmography and dependence on lighting parameters. In: 2015 37th Annu. Int. Conf. IEEE Eng. Med. Biol. Soc., pp. 3747–50, 2015.

  • [189]

    van Gastel M, Zinger S, Kemps H, de With P. e-Health video system for performance analysis in heart revalidation cycling. In: 2014 IEEE Fourth Int. Conf. Consum. Electron. Berlin, pp. 31–5, IEEE, Sep 2014.

  • [190]

    Haque MA, Nasrollahi K, Moeslund TB. Efficient contactless heartbeat rate measurement for health monitoring. Int J Integr Care 2015;15:1–2.

  • [191]

    Kopeliovich M, Petrushan M, Samarin A. Evolutionary algorithm for structural-parametric optimization of the remote photoplethysmography method. Opt Mem Neural Networks 2017;26: (accepted for publication).

  • [192]

    Kumar M, Veeraraghavan A, Sabharwal A. Contact-free camera measurements of vital signs. SPIE Newsroom 2015.

  • [193]

    Lin J, Rozado D, Duenser A. Improving video based heart rate monitoring. Stud Health Technol Inform 2015;214:146–51.

  • [194]

    Lin Y-C, Lin Y-H. A study of color illumination effect on the SNR of rPPG signals. In: 2017 39th Annu. Int. Conf. IEEE Eng. Med. Biol. Soc., Seogwipo, South Korea, pp. 4301–4, IEEE, Jul 2017.

  • [195]

    Mannapperuma K, Holton B, Lesniewski P, Thomas J. Performance limits of ICA-based heart rate identification techniques in imaging photoplethysmography. Physiol Meas 2015;36:67–83.

  • [196]

    Pilz CS, Krajewski J, Blazek V. On the diffusion process for heart rate estimation from face videos under realistic conditions. In: Pattern Recognit 2017:361–73.

  • [197]

    Przybyło J, Kańtoch E, Jabłoński M, Augustyniak P. Distant measurement of plethysmographic signal in various lighting conditions using configurable frame-rate camera. Metrol Meas Syst 2016;23:579–92.

  • [198]

    Seki A, Quan C, Luo Z. Non-contact, real-time monitoring of heart rate with a Webcam with application during water-bed massage. In: 2016 IEEE Int. Conf. Robot. Biomimetics, Qingdao, China, pp. 1703–8, IEEE, Dec 2016.

  • [199]

    Tasli HE, Gudi A, den Uyl M. Remote PPG based vital sign measurement using adaptive facial regions. In: 2014 IEEE Int. Conf. Image Process., Paris, France, pp. 1410–4, IEEE, Oct 2014.

  • [200]

    Unakafov AM. Pulse rate estimation using imaging photoplethysmography: generic framework and comparison of methods on a publicly available dataset. Biomed Phys Eng Express 2018;4:045001.

  • [201]

    Wang W, den Brinker AC, Stuijk S, de Haan G. Robust heart rate from fitness videos. Physiol Meas 2017;38:1023–44.

  • [202]

    Wang W, den Brinker AC, Stuijk S, de Haan G. Color-distortion filtering for remote photoplethysmography. In: 2017 12th IEEE Int. Conf. Autom. Face Gesture Recognit. (FG 2017), Washington, DC, USA, pp. 71–8, IEEE, May 2017.

  • [203]

    Waqar M, Tiddeman B. Advancements in contact-free heart rate measurements using human face videos. In: Proc. 7th UK Br. Mach. Vis. Work., no. 7, Swansea, UK, pp. 1–9, 2015.

  • [204]

    Wei B, Zhang C, Wu X. Comprehensive comparison study on different ICA/BSS methods in IPPG techniques for obtaining high-quality BVP signal. In: Proc. 2016 Int. Conf. Intell. Inf. Process. – ICIIP ’16, Washington, DC, USA, pp. 1–6, 2016.

  • [205]

    Wiede C, Richter J, Hirtz G. Signal fusion based on intensity and motion variations for remote heart rate determination. In: 2016 IEEE Int. Conf. Imaging Syst. Tech., Chania, Greece, pp. 526–31, IEEE, Oct 2016.

  • [206]

    Wu B-F, Chu Y-W, Huang P-W, Chung M-L, Lin T-M. A motion robust remote-PPG approach to driver’s health state monitoring. In: Comput. Vis. ACCV 2016 Work., pp. 463–76, 2017.

  • [207]

    Yan BP, Chan CK, Li CK, To OT, Lai WH, Tse G, et al. Resting and postexercise heart rate detection from fingertip and facial photoplethysmography using a smartphone camera: a validation study. JMIR mHealth uHealth 2017;5:e33.

  • [208]

    Zheng J, Hu S, Echiadis AS, Azorin-Peris V, Shi P, Chouliaras V. A remote approach to measure blood perfusion from the human face. In: Mahadevan-Jansen A, Vo-Dinh T, Grundfest WS, editors. Proc. SPIE, San Jose, California, United States 2009;7169:716917.

  • [209]

    Kaur B, Hutchinson JA, Ikonomidou VN. Visible spectrum-based non-contact HRV and dPTT for stress detection. In: Agaian SS, Jassim SA, editors. Proc. Vol. 10221, Mob. Multimedia/Image Process. Secur. Appl. 2017, vol. 10221, Anaheim, California, United States, p. 102210E, May 2017.

  • [210]

    Spicher N, Kukuk M, Maderwald S, Ladd ME. Initial evaluation of prospective cardiac triggering using photoplethysmography signals recorded with a video camera compared to pulse oximetry and electrocardiography at 7T MRI. Biomed Eng Online 2016;15:126.

  • [211]

    Fan Q, Li K. Non-contact remote estimation of cardiovascular parameters. Biomed Signal Process Control 2018;40:192–203.

  • [212]

    Nishidate I, Tanaka N, Kawase T, Maeda T, Yuasa T, Aizu Y, et al. Noninvasive imaging of human skin hemodynamics using a digital red-green-blue camera. J Biomed Opt 2011;16:086012.

  • [213]

    Van Gastel M, Stuijk S, De Haan G. Camera-based pulse-oximetry – validated risks and opportunities from theoretical analysis. Biomed Opt Express 2018;9:102–19.

  • [214]

    Trumpp A, Rasche S, Wedekind D, Rudolf M, Malberg H, Matschke K, et al. Relation between pulse pressure and the pulsation strength in camera-based photoplethysmograms. Curr Dir Biomed Eng 2017;3:489–92.

  • [215]

    Mamontov OV, Krasnikova T, Volynsky M, Shlyakhto E, Kamshilin AA. Position-dependent changes of blood flow in carotid arteries assessed by camera-based photoplethysmography. Clin Auton Res 2017;27:323–4.

  • [216]

    Al-Naji A, Perera AG, Chahl J. Remote monitoring of cardiorespiratory signals from a hovering unmanned aerial vehicle. Biomed Eng Online 2017;16:101.

  • [217]

    Blocher T, Schneider J, Schinle M, Stork W. An online PPGI approach for camera based heart rate monitoring using beat-to-beat detection. In: 2017 IEEE Sensors Appl. Symp., Glassboro, NJ, USA, pp. 1–6, IEEE, 2017.

  • [218]

    Zhao F, Li M, Qian Y, Tsien JZ. Remote measurements of heart and respiration rates for telemedicine. PLoS One 2013;8:e71384.

  • [219]

    Giannakakis G, Pediaditis M, Manousos D, Kazantzaki E, Chiarugi F, Simos PG, et al. Stress and anxiety detection using facial cues from videos. Biomed Signal Process Control 2017;31:89–101. Google Scholar

  • [220]

    Zhang C, Wu X, Zhang L, He X, Lv Z. Simultaneous detection of blink and heart rate using multi-channel ICA from smart phone videos. Biomed Signal Process Control 2017;33:189–200. Google Scholar

  • [221]

    Bousefsaf F, Maaoui C, Pruski A. Remote assessment of physiological parameters by non-contact technologies to quantify and detect mental stress states. In: 2014 Int. Conf. Control. Decis. Inf. Technol., Metz, France, pp. 719–23, IEEE, Nov 2014. Google Scholar

  • [222]

    Spicher N, Maderwald S, Ladd ME, Kukuk M. Heart rate monitoring in ultra-high-field MRI using frequency information obtained from video signals of the human skin compared to electrocardiography and pulse oximetry. Curr Dir Biomed Eng 2015;1:69–72. Google Scholar

  • [223]

    Rubins U, Miscuks A, Lange M. Simple and convenient remote photoplethysmography system for monitoring regional anesthesia effectiveness. In: EMBEC & NBC 2017, Tampere, Finland, pp. 378–81, 2018. Google Scholar


  • 1

    Strictly speaking, the technique yields the pulse rate, but, as is common in the current literature, we use the term heart rate throughout this article. 

  • 2

    Note that, most often, only compressed data are available; this implies some limitations but still allows for HR detection. 

  • 3

    Note that even window-based approaches have been applied as a basis for HRV analysis (e.g. [137]), but beat detection is much more common. 

About the article

Received: 2017-07-13

Accepted: 2018-05-04

Published Online: 2018-06-13

Published in Print: 2018-10-25

Author Statement

Conflict of interest: Authors state no conflict of interest.

Informed consent: Informed consent is not applicable.

Ethical approval: The conducted research is not related to the use of either humans or animals.

Citation Information: Biomedical Engineering / Biomedizinische Technik, Volume 63, Issue 5, Pages 617–634, ISSN (Online) 1862-278X, ISSN (Print) 0013-5585, DOI: https://doi.org/10.1515/bmt-2017-0119.

©2018 Sebastian Zaunseder et al., published by De Gruyter, Berlin/Boston. This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License. BY-NC-ND 4.0
