
# Nanophotonics

Editor-in-Chief: Sorger, Volker

12 Issues per year

CiteScore 2017: 6.57

IMPACT FACTOR 2017: 6.014
5-year IMPACT FACTOR: 7.020

In co-publication with Science Wise Publishing

Open Access | Online ISSN: 2192-8614

# Ultrafast optical imaging technology: principles and applications of emerging methods

Hideharu Mikami
/ Liang Gao
• Department of Electrical and Computer Engineering, University of Illinois at Urbana-Champaign, Champaign, IL 61801, USA; and Beckman Institute for Advanced Science and Technology, University of Illinois at Urbana-Champaign, Champaign, IL 61801, USA
/ Keisuke Goda
• Corresponding author
• Department of Chemistry, University of Tokyo, Tokyo 113-0033, Japan
• Department of Electrical Engineering, University of California, Los Angeles, CA 90095, USA
• Japan Science and Technology Agency, Tokyo 102-0076, Japan
Published Online: 2016-10-20 | DOI: https://doi.org/10.1515/nanoph-2016-0026

## Abstract

High-speed optical imaging is an indispensable technology for blur-free observation of fast transient dynamics in virtually all areas including science, industry, defense, energy, and medicine. High temporal resolution is particularly important for microscopy, as even a slow event appears to occur “fast” in a small field of view. Unfortunately, the shutter speed and frame rate of conventional cameras based on electronic image sensors are significantly constrained by their electrical operation and limited storage. In recent years, several unique and unconventional approaches to high-speed optical imaging have been reported that circumvent these technical challenges and achieve a frame rate and shutter speed far beyond what can be reached with conventional image sensors. In this article, we review the concepts and principles of such ultrafast optical imaging methods, compare their advantages and disadvantages, and discuss an entirely new class of applications that they make possible.

## 1 Introduction

High-speed optical imaging is an indispensable technology for blur-free observation of fast transient dynamics in virtually all areas including science, industry, defense, energy, and medicine [1–3]. The quest for higher frame rates dates back to 1878, when Eadweard Muybridge performed motion-picture photography with an array of cameras, capturing a galloping horse in a sequence of shots to determine whether the horse’s feet were all off the ground at once (Figure 1) [4]. In today’s scientific research, the ability to visualize dynamical processes is a critical requirement for uncovering and understanding the unknown or hidden laws of nature that govern the dynamics. High-speed cameras are also a common tool for industrial production and manufacturing, car crash testing, robotic vision, biomechanical analysis in sports, and capturing and tracking fast-moving objects in defense; they let us see fast dynamics in slow motion after recording them. By common definition, a high-speed camera is a device that can acquire image frames at a frame rate of 250 frames per second (fps) or higher and record the frames on a storage medium [3].

Fig. 1

Motion-picture photography of a galloping horse performed by Eadweard Muybridge in 1878 (Image courtesy of Eadweard Muybridge). He used an array of cameras to obtain a sequence of shots to determine whether the horse’s feet were all off the ground at once.

High temporal resolution is particularly important for microscopic imaging. When an object under study is on the micrometer-to-nanometer scale, a camera needs not only high spatial resolution but also high temporal resolution to see it. The need for high temporal resolution comes from the principle that even a slow event appears to occur “fast” on a small length scale, and can be found in a diverse range of fields such as photochemistry, plasma physics, microfluidic biotechnology, semiconductor physics, shockwave therapy, and neuroscience. For example, in order to observe an object that travels at a constant speed v with spatial resolution d without motion blur, the observer needs a temporal resolution of d/v, which means that for a smaller (finer) object, the required temporal resolution is higher. To observe an object moving slowly on the human time scale (v = 1 m/s) with a spatial resolution of 1 μm, we need a high temporal resolution of (1 μm) / (1 m/s) = 1 μs, which corresponds to a high frame rate of 1 Mfps (Figure 2). Unfortunately, conventional cameras based on electronic image sensors, such as the charge-coupled device (CCD) and complementary metal-oxide-semiconductor (CMOS), fall short of providing such high frame rates under desirable imaging conditions, as their image acquisition speed is significantly constrained by their electrical operation and limited storage.
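The d/v relation above is easy to check with a short calculation. A minimal sketch (the helper names are illustrative, not from the article) that reproduces the 1 μm / 1 m/s example:

```python
def required_temporal_resolution(spatial_resolution_m, speed_m_per_s):
    """Temporal resolution needed to image a moving object without blur: d / v."""
    return spatial_resolution_m / speed_m_per_s

def required_frame_rate(spatial_resolution_m, speed_m_per_s):
    """Frame rate (fps) matching that temporal resolution: v / d."""
    return 1.0 / required_temporal_resolution(spatial_resolution_m, speed_m_per_s)

# Example from the text: v = 1 m/s imaged at d = 1 um spatial resolution.
print(required_temporal_resolution(1e-6, 1.0))  # 1e-06 s, i.e. 1 us
print(required_frame_rate(1e-6, 1.0))           # ~1e6 fps, i.e. ~1 Mfps
```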

Fig. 2

Temporal resolution required as a function of spatial resolution. As the spatial resolution becomes higher, the temporal resolution for imaging of a moving object needs to be finer, even at a constant moving speed.

Over recent years, several unique and unconventional approaches to high-speed optical imaging have been reported to circumvent these technical hurdles and achieve a frame rate and shutter speed far beyond what can be reached with conventional image sensors [5–10]. Serial time-encoded amplified imaging / microscopy (STEAM) [5, 11] is a continuous imaging method with a built-in optical image amplifier that enables continuous image acquisition at a high frame rate of ~100 Mfps without sacrificing sensitivity. Sequentially timed all-optical mapping photography (STAMP) [6, 12] is an ultrafast burst imaging method capable of achieving motion-picture femtophotography at a frame rate of ~1 Tfps with its all-optical operation that overcomes the electrical limitations of conventional image sensors. Single-shot multispectral tomography (SMT) [7] is a high-speed tomographic imaging method based on spectral multiplexing for simultaneous illumination of the imaging target at multiple angles of incidence. Fluorescence imaging by radiofrequency-tagged emission (FIRE) [9] is a high-speed fluorescence imaging method that achieves ~100 times higher frame rates than conventional methods by frequency-modulating the excitation light spatially and demodulating the fluorescence signal after detecting it with a sensitive single-pixel photodetector. Compressed ultrafast photography (CUP) [10] is an ultrafast receive-only streak imaging method that incorporates a compressive-sensing architecture into the data acquisition of a streak camera and consequently achieves two-dimensional (2D) receive-only image acquisition at a frame rate of ~100 Gfps.

In this article, we review these ultrafast imaging methods that have emerged over the last several years. Specifically, we discuss their key concepts and working principles, in particular how they overcome the limitations of conventional cameras to reach higher frame rates and shutter speeds. We also compare their advantages and disadvantages and discuss an entirely new class of applications they have brought us. This review article is organized as follows. In Section 2, we discuss the principles and limitations of conventional high-speed optical imaging. In Section 3, we discuss the principles and applications of the emerging optical imaging methods (i.e., STEAM, STAMP, SMT, FIRE, and CUP). In Section 4, we compare these imaging methods. In Section 5, we conclude the article and discuss future prospects of high-speed optical imaging.

## 2 Principles and limitations of conventional high-speed optical imaging

In the early days of high-speed photography, most film-based cameras relied on mechanical mechanisms, such as a rotating mirror [13] or prism [14], to achieve fast imaging. In a typical rotatory-prism-based system (Figure 3A), a prism with multiple facets placed between the camera lens and film rotates at a high rate. To avoid motion blur, the film must move at a speed synchronized with the rotation of the prism, such that the deflected image remains relatively static with respect to the film as the prism rotates. The accumulation of photons on the film continues until the angle between the incident light and the prism exceeds a critical value, which essentially shutters the view and stops photon integration. This process repeats as the next facet of the prism starts to transmit the subsequent temporal frame of the event onto the film. Due to the physical limitation on the rate at which a film can be transported through a camera, the maximum frame rate of a rotatory-prism camera is approximately 40 kfps [3].

Fig. 3

Principles of film-based photography. (A) Rotatory-prism-based system. (B) Rotatory-mirror-based system.

To relieve the mechanical stress on the film and achieve a higher frame rate, a rotatory-mirror-based system keeps the film static and rotates only the mirror. Figure 3B shows a representative system setup. The object is first imaged onto a mirror that spins at a high speed. The reflected light is then reimaged by an array of lenses onto the film, which is held stationary in an arc centered about the rotating mirror. The imaging speed, which depends on the rotation speed of the mirror, reaches up to 25 Mfps [3]. However, such cameras suffer from mechanical problems such as inertia and friction, which prevent them from operating at higher frame rates. In addition, due to its limited angular range of rotation, a rotatory-mirror camera can capture only a small number of consecutive frames (typically < 20 frames).

Since the invention of the CCD by Willard Boyle and George E. Smith at AT&T Bell Labs in 1969 [15], the advent of digital cameras has revolutionized high-speed photography and propelled rapid developments in the field. In recent years, the CCD image sensor has gradually been replaced by the CMOS image sensor. The acquisition of a digital image by these sensors is generally based on the following scheme. A 2D detector array is exposed to the incident light for a given integration time. The image data are then digitized and transferred to the host computer. For high-speed imaging, this process should be repeated as fast as possible. Provided that there are enough photons, the two bottlenecks that limit a digital camera’s frame rate are the camera’s digitization bandwidth and data transfer speed.

High frame rates can be achieved by employing a high-bandwidth digitizer. Current consumer-grade digital cameras typically have only one digitizer on their chip. Digitizing an image with one million pixels typically takes ~10 ms. By integrating multiple digitizers on the same chip and employing a parallelization framework for the CMOS image-sensing operation, one can significantly increase the digitization bandwidth of the camera. For example, the Phantom v2512 camera from Vision Research has achieved a 700 GHz digitization bandwidth, which is approximately four orders of magnitude broader than that of a consumer-grade camera with a single digitizer. Such high-speed CMOS cameras are commonly used in industrial and military applications.
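When digitization is the bottleneck, the achievable frame rate is just the aggregate digitization rate divided by the pixel count per frame. A back-of-envelope sketch (the per-pixel throughput numbers are illustrative assumptions, not actual camera specifications):

```python
def max_frame_rate(digitization_rate_px_per_s, pixels_per_frame):
    """Upper bound on frame rate when digitization is the only bottleneck."""
    return digitization_rate_px_per_s / pixels_per_frame

# A single digitizer needing ~10 ms per 1-Mpixel image ~ 100 Mpixel/s throughput.
print(max_frame_rate(100e6, 1e6))   # 100.0 fps
# A hypothetical chip whose parallel digitizers aggregate 10,000x that rate.
print(max_frame_rate(1e12, 1e6))    # 1000000.0 fps, i.e. 1 Mfps
```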

However, even with unlimited digitization bandwidth, the frame rate of a digital camera is still limited by the data transfer speed from the camera to the host computer. For example, using high-speed interfaces such as Camera Link, USB 3.0, or PCI Express, a high-speed camera with one million pixels can stream digitized images to a computer at only ~100 fps [16]. To circumvent the data transfer bandwidth limitation, a common strategy dubbed burst imaging is used to store the images locally on the camera’s chip and transfer them to the computer after data acquisition. By leveraging the on-chip memory, off-the-shelf high-speed cameras can record images with one million pixels at 7 kfps (e.g., the Fastcam SA5 camera from Photron). However, this type of high-speed camera normally requires a trigger signal to initialize recording. Additionally, the number of recorded temporal frames is strictly limited by the camera’s memory storage, resulting in a short observation window that typically lasts for a few seconds at best. The limited temporal window and the requirement for a trigger signal are serious handicaps in imaging unpredictable events, such as optical rogue waves (rare giant optical pulses resulting from nonlinear interactions in a microstructured fiber) [17], the onset of neuron firing (an impulsive change in neuronal action potential in the brain) [18], and circulating tumor cells (rare metastatic cancer cells that circulate in the bloodstream and seed secondary tumors) [19] – phenomena in which nothing occurs before the event of interest that could be used to trigger the beginning of recording.

Additionally, one must take into consideration a fundamental limit on image quality imposed by high frame rates. Because the detector’s thermal noise and shot noise are proportional to the square root of the signal bandwidth [20], the broader the bandwidth, the noisier the image. Therefore, a higher frame rate is generally accompanied by a higher noise level and a lower signal-to-noise ratio (SNR). For a given application, one must balance the camera’s frame rate against the desired SNR.
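The bandwidth-noise trade-off can be quantified: with noise amplitude growing as the square root of bandwidth and a fixed signal level, the SNR penalty in decibels for a given bandwidth increase is a one-liner (a sketch under that scaling assumption, not a formula from the article):

```python
import math

def snr_penalty_db(bandwidth_ratio):
    """SNR change (dB) when bandwidth is multiplied by `bandwidth_ratio`,
    assuming noise amplitude grows as sqrt(bandwidth) and signal is fixed."""
    return -10.0 * math.log10(math.sqrt(bandwidth_ratio))

print(snr_penalty_db(2))    # ~-1.5 dB: doubling the frame rate costs ~1.5 dB of SNR
print(snr_penalty_db(100))  # -10.0 dB for a 100x faster readout
```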

## 3 Principles and applications of emerging ultrafast optical imaging methods

## 3.1 Serial time-encoded amplified imaging / microscopy (STEAM)

STEAM [5, 11] is an ultrafast continuous imaging method that runs at a frame rate of ~100 Mfps and is schematically shown in Figure 4A. A broadband pulse from a mode-locked femtosecond pulse laser first enters a 2D spatial disperser that maps the spectrum of the pulse into a 2D rainbow. The rainbow is incident onto the object in such a way that different frequency components of the rainbow (pulse) illuminate different spatial coordinates on the object. The 2D spatial profile of the object is thus encoded in the spectrum of the reflected rainbow, which returns to the same spatial disperser, where it is recombined into a broadband pulse. The pulse then enters the amplified dispersive Fourier transformer, which maps the spectrum of the image-encoded pulse into a temporal waveform and optically amplifies it before detection with a single-pixel photodetector. In other words, the image is serialized and amplified in the optical domain. The amplified dispersive Fourier transformer consists of a low-loss dispersive fiber with a large amount of group-velocity dispersion, which is optically pumped by continuous-wave (CW) lasers for distributed Raman amplification in the fiber via stimulated Raman scattering. Resembling the time-domain signal in an optical communication link, the optically amplified spatial profile appears as a time-modulated waveform that is captured by a single-pixel photodetector and digitized by an analog-to-digital converter. The one-dimensional (1D) digitized signal is mapped back into a 2D profile in the digital domain for image construction. The process is repeated pulse by pulse, so that the pulse repetition rate of the laser is equivalent to the frame rate of STEAM. Consequently, STEAM achieves a shutter speed of ~100 ps, a frame rate of ~100 Mfps, and an optical image gain of ~30 dB.
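The wavelength-to-time mapping at the heart of the dispersive Fourier transform is, to first order, the product of the spectral width, the fiber's dispersion parameter, and the fiber length. A sketch with hypothetical values (not the parameters of the actual STEAM demonstrations):

```python
def dispersive_stretch_ps(bandwidth_nm, dispersion_ps_per_nm_km, fiber_length_km):
    """Duration (ps) of the serialized image waveform after group-velocity
    dispersion maps the pulse spectrum into time: delta_lambda * D * L."""
    return bandwidth_nm * dispersion_ps_per_nm_km * fiber_length_km

# Hypothetical: 20 nm of image-encoded bandwidth, 100 ps/nm/km fiber, 5 km long.
stretch = dispersive_stretch_ps(20, 100, 5)
print(stretch)  # 10000 ps = 10 ns of time-domain waveform per laser pulse
```

In practice, the stretch is chosen no longer than the laser's repetition period, since the repetition rate sets the frame rate (~10 ns per frame at ~100 Mfps).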

Fig. 4

STEAM. (A) Schematic. (B) Surface inspection with STEAM. (C) Surface vibrometry with STEAM. (D) Cancer detection with STEAM. ©Figure courtesy of Ref. [23, 24, 26, 27].

While STEAM is an ultrafast imager that can be used in diverse applications, its notable strength, as opposed to burst-mode ultrafast imaging methods such as STAMP and CUP, is its ability to acquire images continuously without limit. Ultrafast imaging has traditionally been used to observe fast transient dynamics in a fixed object (e.g., explosions and chemical reactions) [2, 21], but it can also be used for monitoring, evaluating, and characterizing a large number of objects in a short period of time for high-precision statistical analysis. For this reason, STEAM has proven most useful in applications that require fast continuous monitoring, such as surface inspection of parts for manufacturing and quality control [22, 23], surface vibrometry for diagnosing aerospace components and musical instruments [24, 25], and flow cytometry of biological cells [26, 27].

First, STEAM has been used to perform high-speed surface inspection for detection of micrometer-sized surface defects that travel at a speed as high as ~3 m/s (Figure 4B) [22, 23]. In this inspector, a modified STEAM camera is employed to illuminate fast-moving objects with temporally and spatially dispersed pulses and to detect scattered light from defects on their surfaces (e.g., a nonreflective black film, transparent flexible film, and reflective hard disk) with a sensitive photodetector in both bright-field and dark-field configurations. Consequently, the surface inspector provides nearly 1000 times higher scanning speed than conventional inspectors. It is expected to reduce the cost and improve the performance of organic light-emitting diode displays for next-generation smartphones, lithium-ion batteries for green electronics, and high-efficiency solar cells.

Also, STEAM has been set up in an interferometric configuration and used as a surface vibrometer for real-time observation of fast nanomechanical surface vibrations with sub-nanometer axial resolution [24, 25]. In general, high-speed surface vibrometry is important for nondestructive diagnosis of mechanical components, but its real-time operation has been difficult due to the speed limitations of laser scanners. While stroboscopic imaging can be used to evaluate the dynamics of mechanical systems faster than the frame rate of a camera, it requires their motion to be repetitive (i.e., vibrating, rotating, oscillating, or reciprocating) and is thus unable to monitor random or nonrepetitive dynamics. Specifically, the STEAM surface vibrometer was used to monitor the surface profile of a fast-vibrating diaphragm with an oscillation frequency of 1 kHz at a frame rate of 105.4 kfps (Figure 4C).

Finally, STEAM has been integrated with a flow cytometer and high-speed digital signal processor to perform high-throughput image cytometry at a record high throughput of 100,000 cells/s [26, 27]. The technique was used to identify rare cancer cells in a large heterogeneous population of blood cells with an unprecedentedly low false-positive rate of one in a million. Figure 4D shows a library of STEAM images. This method is effective for high-throughput screening of an extremely small number (~10 per mL of blood) of circulating tumor cells – precursors to cancer metastasis, which causes 90% of cancer deaths. It can analyze 10 mL of lysed blood in less than 15 minutes and holds promise for early, noninvasive, low-cost detection of cancer and evaluation of chemotherapy.

## 3.2 Sequentially timed all-optical mapping photography (STAMP)

STAMP [6, 12] is an ultrafast burst imaging method that achieves a record high frame rate of ~1 Tfps. The principle of STAMP is based on projecting the target’s time-varying spatial profile onto the spatial domain, enabled by using temporal and spatial dispersion to separate the time-varying event in time and space, respectively. Since spatial and temporal dispersion are purely optical effects, this approach eliminates the speed bottleneck that the mechanical and electrical operation of a conventional camera imposes on motion-picture acquisition.

STAMP is schematically shown in Figure 5A. The STAMP camera is composed of five components: an ultrashort laser source, a temporal mapping device (TMD), a spatial mapping device (SMD), an image sensor, and a computer. First, an ultrashort laser pulse emitted from the laser source is temporally stretched and shaped by the TMD into several daughter pulses of different spectral bands. The laser source is a Ti:Sapphire femtosecond pulse laser with a chirped pulse amplifier that emits a pulse train with a repetition rate of 1 kHz, a center wavelength of 810 nm, and a pulse width of 70 fs. An optical chopper and mechanical shutter are used to pick a single pulse. The single pulse is split into a pulse for STAMP imaging and an excitation pulse for triggering an ultrafast event. The TMD consists of a pulse stretcher (e.g., a grating pair and glass rod) and a pulse shaper (e.g., a 4f lens system with a spatial light modulator) that tailors the temporal shape and intensity of each daughter pulse to be identical, such that the successive “flashes” are the same except for the timing of their arrival at the target. The image-encoded daughter pulses are spatially separated by the SMD, which consists of a diffraction grating, periscope array, and lenses, while satisfying the requirement for image formation on the image sensor without image deformation. The recorded images on the image sensor are digitally processed and recombined in the time domain to produce a motion picture from the temporally stacked image frames, based on the precalibrated relations between time and optical wavelength and between optical wavelength and the spatial coordinates on the detector. The frame rate and exposure time of the STAMP camera can be determined by tuning the temporal dispersion in the TMD.
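The last point, the frame rate being set by the temporal dispersion, reduces to dividing the total stretch among the daughter pulses. A sketch with assumed numbers (the ~92 ps stretch and 6-pulse split are illustrative assumptions chosen to match the ~15.3 ps frame interval reported for the plasma experiment):

```python
def frame_interval(total_stretch_s, num_daughter_pulses):
    """Inter-frame interval when a stretched pulse is split into equally
    spaced spectral bands, one band per STAMP frame."""
    return total_stretch_s / num_daughter_pulses

def frame_rate(total_stretch_s, num_daughter_pulses):
    """Burst frame rate: the reciprocal of the frame interval."""
    return 1.0 / frame_interval(total_stretch_s, num_daughter_pulses)

# Assumed: ~92 ps of total stretch split into 6 daughter pulses.
print(frame_interval(91.8e-12, 6))      # ~1.53e-11 s = 15.3 ps
print(frame_rate(91.8e-12, 6) / 1e9)    # ~65.4 Gfps
```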

Fig. 5

STAMP. (A) Schematic. (B) Imaging of plasma dynamics with STAMP. (C) Imaging of phonon dynamics with STAMP. ©Figure courtesy of Ref. [6].

STAMP has been used to monitor the early-stage plasma dynamics of a femtosecond-laser-induced ablation process on the picosecond time scale (Figure 5B). Here an excitation pulse with a pulse energy of 100 μJ and pulse duration of 70 fs was focused on the surface of a glass plate to ablate it. As shown in the figure, the plasma dynamics was visualized with the STAMP camera in a shadowgraph mode with a frame interval of 15.3 ps (corresponding to 65.4 Gfps) and an exposure time of 13.8 ps. As shown in the image sequence or movie, the plasma plume generated by the femtosecond laser expands rapidly in the radial direction. From the movie, the speed of the plume front was found to be approximately 10⁵ m/s.

Also, STAMP has been used to observe phonon dynamics in a crystal on the femtosecond time scale (Figure 5C). Here an excitation pulse with a pulse energy of 40 μJ and pulse duration of 70 fs was line-focused into a LiNbO3 wafer to produce a phonon-polariton pulse via impulsive stimulated Raman scattering. The pulse formation was visualized in a polarization-gating mode and captured with a frame interval of 812 fs (corresponding to 1.23 Tfps) and an exposure time of 1020 fs. A STAMP movie of the phonon dynamics is shown in the figure. It indicates that the intense electromagnetic field of the laser pulse excites lattice vibrations in the crystal, after which a phonon-polariton wave-packet forms from a seemingly random response of the crystal and propagates away from the excitation region. The figure also shows the wave-packet propagation with a finer frame interval of 229 fs (corresponding to 4.37 Tfps). The speed of the phonon-polariton wave-packet was determined from the acquired images to be 4.6 × 10⁷ m/s (one-sixth of the speed of light).

## 3.3 Single-shot multispectral tomography (SMT)

SMT [7] is a nonscanning tomographic imaging method that enables single-shot cross-sectional image acquisition on a femtosecond time scale. This method is based on spectral multiplexing of a single laser beam to conduct tomographic imaging over a continuous range of illumination angles simultaneously. Specifically, spectral multiplexing for position encoding is performed by focusing a spectrally broadband laser beam with angular dispersion, resulting in a linear array of point light sources, each of which has a different wavelength (Figure 6A). The spectrum of the transmitted light, on which the imaging target’s spatial profile is encoded, is measured by a spectrometer consisting of a spectral disperser such as a diffraction grating and a 2D image sensor (e.g., a CCD or CMOS image sensor), on which the spectral interference of the transmitted light with a preceding reference ultrashort laser pulse is recorded. Different wavelength components of the transmitted light are mapped onto different positions on the image sensor. This process enables a simultaneous measurement of the multiple fan-beam transmissions. To show its utility, laser-induced plasma generated by a high-intensity ultrashort laser pulse has been observed with SMT with a temporal resolution of 12 ps (Figure 6B).

Fig. 6

Ultrafast tomographic imaging. (A) Schematic of SMT. (B) Imaging of plasma dynamics with SMT. (C) Schematic of tomographic imaging with the FDT. (D) Imaging of light pulse propagation with the FDT. ©Figure courtesy of Ref. [7] and [8].

SMT’s spatiotemporal resolution is determined by several factors. The temporal resolution is constrained by the time required for the probe beams to traverse the imaging object, the arrival-time ambiguity of the probe beams reaching the object, and the coherence time of the beams. In the previous demonstration of SMT [7], the second factor was dominant, resulting in a temporal resolution of 12 ps. On the other hand, the spatial resolution of SMT is limited by the divergence angle of each fan beam. This situation is analogous to the longitudinal spatial resolution of conventional wide-field microscopy, given by ∼λ/NA², where λ is the optical wavelength and NA is the numerical aperture of the objective lens used for imaging. In the previous demonstration [7], the NA was relatively small, resulting in limited spatial resolution, particularly in the longitudinal direction.
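The λ/NA² scaling cited above can be made concrete; the values below are illustrative, not those of the SMT demonstration:

```python
def axial_resolution_m(wavelength_m, numerical_aperture):
    """Longitudinal (axial) resolution estimate for wide-field imaging,
    using the lambda / NA^2 scaling cited in the text."""
    return wavelength_m / numerical_aperture ** 2

# Illustrative: 800 nm probe light at a small vs. a moderate NA.
print(axial_resolution_m(800e-9, 0.1))  # ~8e-05 m: 80 um, coarse axial resolution
print(axial_resolution_m(800e-9, 0.5))  # ~3.2e-06 m: 3.2 um
```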

Recently, frequency-domain tomography (FDT) [8, 28] has been used to overcome a limitation of SMT: while SMT can acquire a tomographic image in a single shot, it cannot perform ultrafast repetitive tomographic image acquisition due to the limited scan rate of the spectrometer (more specifically, of the CCD image sensor in the spectrometer), which is typically 100 Hz to 10 kHz. In this method, chirped light beams are irradiated onto an object at certain incidence angles and their spatiotemporal profile is obtained by spectral interference (Figure 6C) [29]. By virtue of chirping, the wavelength components of the laser light are located at different longitudinal positions at a given time, which results in a one-to-one relation between the wavelength components of the laser light and the longitudinal positions of the object. By obtaining an FDT image with multiple beams that illuminate the object at different incidence angles simultaneously, tomographic reconstruction from a single-shot data set is made possible. With this method, a movie of light pulse propagation in a glass medium has been obtained by taking advantage of the nonlinear Kerr effect (Figure 6D). This method significantly extends the capability of SMT. However, because tomographic illumination is available only at a limited number of discrete sampling angles, the spatial resolution of this method restricts it to imaging simple-structured objects.

## 3.4 Fluorescence imaging by radiofrequency-tagged emission (FIRE)

Fluorescence microscopy is an essential tool for biological and medical research. While efforts to achieve higher spatial resolution in fluorescence imaging are important, as recognized by the 2014 Nobel Prize in Chemistry, it has also been a major technical challenge to observe fast transient phenomena in biological samples such as calcium and metabolic waves in live cells [30, 31], action potential sequences in a large network of neurons [32–34], and calcium release correlations and signaling in cardiac tissue [35]. High-speed fluorescence imaging is also needed for high-throughput image cytometry of a large number of cells [9, 26].

To address the high demand and the speed-sensitivity trade-off in conventional fluorescence microscopy, a high-speed fluorescence imaging method known as FIRE has recently been developed. The key concept of FIRE is a combination of frequency multiplexing and spectral mapping in the radiofrequency band, in analogy with the optical frequency band used in STEAM, to encode the spatial profile of fluorescence light into the radiofrequency spectrum. FIRE can also be seen as a high-speed version of fluorescence confocal microscopy with a parallelized readout, enabling sub-millisecond temporal resolution and a frame rate of more than ~1 kfps in fluorescence microscopy [9]. By virtue of this high frame rate, FIRE has been applied to flow cytometry to image every single cell flowing at a speed of 1 m/s, corresponding to a high throughput of 25,000 cells/s.

A schematic of FIRE is shown in Figure 7A. Excitation light beams are generated in an interferometric setup. An acousto-optic deflector (AOD) driven by an electric frequency comb signal generates deflected beams, each of which has a characteristic deflection angle and optical frequency shift equal to each frequency component in the driving signal (Figure 7B). The deflected beams interfere with another beam having the original optical frequency, which results in intensity modulation of the deflected beams due to the optical beating between the two interfering light beams. These beams with radiofrequency modulations (10–100 MHz) are focused onto a sample via an objective lens (Figure 7B). The focal spots are aligned in a line due to the different deflection angles. Fluorescence light generated from the spots is detected by a sensitive single-pixel photodetector such as a photomultiplier tube. A fluorescence signal from each of the focal spots is extracted from the detected signal by digital signal processing. A fluorescence image of the sample is reconstructed from the extracted signals (Figure 7C). More concretely, the detected signal is expressed as

$$s(t) = \sum_{i=1}^{n} p_i(t)\, I_0 \left[ 1 + \cos(2\pi f_i t + \delta_i) \right] \tag{1}$$

Fig. 7

FIRE. (A) Schematic. (B) Principle of radiofrequency encoding. (C) Principle of the image reconstruction. (D) Fluorescence imaging of cells with FIRE in comparison to conventional wide-field microscopy. ©Figure courtesy of Ref. [9].

where n is the number of focal spots, p_i(t) is the fluorescence signal that would be obtained without temporal modulation of the excitation light, and I_0, f_i, and δ_i are the average intensity level, beat frequency, and phase offset of each excitation beam, respectively. The Fourier transform of the signal is

$$S(f) = \int_{-\infty}^{\infty} s(t)\, e^{-2\pi i f t}\, dt = \sum_{k=1}^{n} P_k(f) + \frac{1}{2} \sum_{k=1}^{n} P_k(f - f_k)\, e^{i\delta_k} + \frac{1}{2} \sum_{k=1}^{n} P_k(f + f_k)\, e^{-i\delta_k} \tag{2}$$

where P_i(f) is the Fourier transform of p_i(t)I_0. Thus, as long as the signal bandwidths are less than half of the frequency spacing f_{i+1} − f_i of the excitation beams, the signal components P_i(f − f_i) can be individually extracted, as they are clearly separated in the frequency domain. Fluorescence images of cells obtained with FIRE are shown in Figure 7D. Although similar ideas of frequency multiplexing in fluorescence imaging have been reported previously, they have much slower modulation speeds than FIRE because mechanical scanning is used for modulation [36–39].
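The demultiplexing step that Eq. (2) justifies can be sketched numerically. The snippet below builds a two-channel signal of the form of Eq. (1) (with phase offsets taken as zero and constant fluorescence levels for simplicity; the beat frequencies and amplitudes are made-up values) and recovers each channel by lock-in demodulation, which amounts to reading out the corresponding sideband of Eq. (2):

```python
import math

def fire_signal(t, amps, beat_freqs, i0=1.0):
    """Eq. (1) with delta_i = 0: each focal spot's fluorescence p_i is
    intensity-modulated at its own beat frequency f_i."""
    return sum(p * i0 * (1.0 + math.cos(2.0 * math.pi * f * t))
               for p, f in zip(amps, beat_freqs))

def lock_in(samples, dt, f):
    """Recover the channel tagged at frequency f: multiply by the reference
    cosine and average; the factor 2 undoes the 1/2 from <cos^2> = 1/2."""
    acc = sum(s * math.cos(2.0 * math.pi * f * k * dt)
              for k, s in enumerate(samples))
    return 2.0 * acc / len(samples)

# Two focal spots tagged at 10 and 25 MHz with fluorescence levels 0.7 and 0.3.
fs = 200e6               # sample rate (Hz)
dt = 1.0 / fs
n = 20000                # 100 us integration window (integer beat periods)
samples = [fire_signal(k * dt, [0.7, 0.3], [10e6, 25e6]) for k in range(n)]
print(round(lock_in(samples, dt, 10e6), 6), round(lock_in(samples, dt, 25e6), 6))
```

The channels separate cleanly here because the integration window spans integer numbers of all beat (and beat-difference) periods; with time-varying p_i(t), the separation holds as long as each channel's bandwidth stays below half the beat-frequency spacing, as stated above.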

FIRE’s specifications are characterized as follows. The spatial resolution of FIRE lies between those of fluorescence confocal microscopy and wide-field microscopy. It is known that the spatial resolution of confocal microscopy is improved by placing an aperture in front of the detector, and a pinhole aperture (used in a standard confocal microscope) provides higher resolution than a slit aperture (used in FIRE) [40]. Since FIRE has an optical sectioning capability, its extension to three-dimensional imaging is possible. The temporal resolution of FIRE is given by the reciprocal of its frame rate (demonstrated to be ~230 μs). It is important to note that fluorescence microscopy, in general, suffers from a much weaker signal level than scattering- or diffraction-based microscopy, which results in a lower frame rate than other ultrafast imaging techniques such as STEAM, STAMP, and CUP.

## 3.5 Compressed ultrafast photography (CUP)

Based on computational imaging, CUP [10] breaks the digitization bandwidth limitation by incorporating a compressed-sensing architecture into the data acquisition of a streak camera. Conventionally, a streak camera is a 1D ultrafast imaging device [41] whose narrow entrance slit (typically 10–50 μm wide) limits the imaging field of view to a line. Upon arrival, the incident photons are converted into photoelectrons at a photocathode. The photoelectrons are then deflected by a sweeping voltage applied along the vertical axis, with the amount of deflection determined by the arrival time of the incident photons. Finally, the photoelectrons are converted back to photons on a phosphor screen and measured by a 2D detector array such as a CCD image sensor. In the final 2D image, the event’s spatial information is contained in the pixels along the horizontal axis, while the temporal information is contained in the pixels along the vertical axis.
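
The linear time-to-space conversion at the heart of a streak camera can be sketched in a few lines; the 100 ps sweep window and 512-row sensor below are assumed, illustrative numbers rather than the specifications of any particular instrument.

```python
# Minimal sketch of a streak camera's time-to-space mapping: a linear
# voltage sweep deflects photoelectrons so that a photon's arrival time
# within the sweep window maps to a vertical pixel row on the sensor.

def streak_row(t_arrival_ps, sweep_window_ps=100.0, n_rows=512):
    """Map an arrival time within the sweep window to a sensor row."""
    frac = t_arrival_ps / sweep_window_ps        # fraction of sweep elapsed
    return min(n_rows - 1, int(frac * n_rows))   # later photons land lower

# Two photons arriving 50 ps apart land ~256 rows apart on a 512-row sensor.
rows = [streak_row(tp) for tp in (10.0, 60.0)]
```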

Leveraging the receive-only advantage, streak cameras have been widely used in a variety of applications such as fluorescence lifetime imaging microscopy [42, 43], ultrafast photography [44, 45], and time-resolved spectroscopy [46]. However, because the field of view (FOV) is limited to a line, a streak camera must scan in the direction perpendicular to its entrance slit to acquire a 2D image. This requirement imposes a severe restriction on the applicable objects: the event itself must be repetitive at each scanning position. In cases where a physical phenomenon is either difficult or impossible to repeat, such as optical rogue waves [17] and laser-induced breakdown in a material [47], streak cameras are inapplicable.

An intuitive solution to this problem is to fully open the streak camera’s entrance slit and allow a 2D image as the input. However, because the temporal information also occupies the vertical axis, simply opening the entrance slit would introduce spatial-temporal crosstalk. By contrast, rather than directly imaging the scene, CUP first spatially encodes the input image with a random binary pattern and then projects the resultant image onto the streak camera. Mathematically, the image formation process in CUP is equivalent to successively applying a spatial encoding operator, C, and a temporal shearing operator, S, to the input dynamic scene, I(x, y, t):

$$I_s(x'',y'',t)=SC\,I(x,y,t)$$ (3)

where $I_s(x'',y'',t)$ denotes the resultant encoded, temporally sheared scene. Next, $I_s(x'',y'',t)$ is measured by a 2D detector array, where the light is spatially integrated over each CCD pixel and temporally integrated over the camera’s exposure time. This detection process can be formulated as

$$E(m,n)=T\,I_s(x'',y'',t)$$ (4)

where T is the spatiotemporal integration operator. Combining Eq. (3) and Eq. (4) gives

$$E(m,n)=TSC\,I(x,y,t)$$ (5)

The image reconstruction solves the inverse problem of Eq. (5). Provided that the event is sparse in the spatiotemporal domain, the event datacube I can be reasonably estimated by using compressed-sensing algorithms, such as two-step iterative shrinkage/thresholding (TwIST) [48].
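
A minimal numerical sketch of the forward model in Eq. (5) and of a gradient-based inversion is given below. The toy datacube size, the random mask, and the plain least-squares iteration (standing in for the sparsity-regularized TwIST solver actually used in CUP) are all illustrative assumptions.

```python
import numpy as np

# Toy CUP forward model E = T S C I: C multiplies every temporal frame by
# a static random binary mask, S shears frame t down by t rows (the streak
# deflection), and T integrates all frames on the detector.
rng = np.random.default_rng(0)
nx, ny, nt = 8, 8, 4
mask = rng.integers(0, 2, size=(ny, nx)).astype(float)  # encoding operator C

def forward(I):
    """Apply T S C to a datacube I of shape (nt, ny, nx)."""
    E = np.zeros((ny + nt - 1, nx))
    for t in range(nt):
        E[t:t + ny, :] += mask * I[t]   # encode, shear by t rows, integrate
    return E

def adjoint(E):
    """Transpose of the forward operator, used for gradient steps."""
    I = np.zeros((nt, ny, nx))
    for t in range(nt):
        I[t] = mask * E[t:t + ny, :]
    return I

# Sanity check: <forward(x), y> must equal <x, adjoint(y)>.
x, y = rng.random((nt, ny, nx)), rng.random((ny + nt - 1, nx))
adjoint_ok = np.isclose(np.sum(forward(x) * y), np.sum(x * adjoint(y)))

# Sparse ground-truth scene, placed where the mask transmits light.
ys, xs = np.nonzero(mask)
I_true = np.zeros((nt, ny, nx))
I_true[1, ys[0], xs[0]] = 1.0
I_true[3, ys[1], xs[1]] = 2.0
E = forward(I_true)

# A few gradient steps on ||E - TSC I||^2 shrink the residual; TwIST-style
# solvers add a sparsity prior on top of exactly this data-fit term.
I_hat = np.zeros_like(I_true)
losses = []
for _ in range(50):
    r = forward(I_hat) - E
    losses.append(float(np.sum(r ** 2)))
    I_hat -= 0.1 * adjoint(r)
```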

A typical CUP setup is shown in Figure 8A. The scene is first imaged by a camera lens to an intermediate plane. This intermediate image is then relayed to a spatial encoding element, a digital micromirror device (DMD), through a 4f system. The DMD consists of tens of thousands of micromirrors, each of which can be individually turned on (tilted +12° with respect to the surface normal) or off (tilted -12° with respect to the surface normal). A static, random binary pattern is displayed on the DMD. The light reflected from the “on” micromirrors is collected by a microscope objective and relayed to a streak camera with a fully opened entrance port [~17 mm (width) × 5 mm (height)].

Fig. 8

CUP. (A) Schematic. (B) Imaging of light pulse refraction with CUP. (C) Imaging of the racing of two pulses with CUP. (D) Imaging of a fluorescence decay with CUP. (E) Fluorescence decay measurement with CUP. ©Figure courtesy of Ref. [10].

CUP has been used to perform 2D ultrafast imaging of a laser pulse (7 ps duration) reflected from a mirror and refracted at an air-resin interface (Figure 8B). This was the first time that video recording of photon propagation was accomplished with a single laser pulse within a single camera exposure. Moreover, the racing of two laser pulses (one propagating in air and the other in resin) has also been observed with CUP (Figure 8C). Due to the refractive index difference, the laser pulse propagated faster in air than in the resin. The calculated light speeds in air and resin were consistent with theoretical values.
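
The pulse-racing observation follows directly from $v = c/n$: over the same time window, the pulse in air outruns the pulse in resin by the ratio of the refractive indices. The 50 ps window and $n_{\mathrm{resin}} = 1.5$ below are illustrative assumptions, not values from Ref. [10].

```python
# Back-of-envelope check of the pulse race: the speed of light in a
# medium is v = c / n, so the air pulse travels ~1.5x farther than the
# resin pulse in the same time window.
c = 2.998e8                       # vacuum speed of light [m/s]
n_air, n_resin = 1.0003, 1.5      # refractive indices (resin value assumed)
dt = 50e-12                       # observation window [s] (illustrative)

d_air = c / n_air * dt * 1e3      # distance traveled in air [mm]
d_resin = c / n_resin * dt * 1e3  # distance traveled in resin [mm]
```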

Furthermore, a CUP system with a spectral separation unit has been used to measure the time-lapse fluorescence decay after pulsed-laser excitation. A fluorescent dye, Rhodamine 6G, was excited by a single 7 ps laser pulse at 532 nm. Representative temporal frames are shown in Figure 8D. In addition, the fluorescence lifetime calculated from the decay measurement with CUP (Figure 8E), 3.8 ns, agrees with a previously reported value [49]. Notably, although only two-color imaging was demonstrated, this approach permits upscaling to more spectral channels simply by adding more dichroic filters with evenly spaced cut-on wavelengths into the stack [50].
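
How a lifetime is extracted from a decay trace such as the one in Figure 8E can be sketched as a log-linear fit of $I(t) = I_0\,e^{-t/\tau}$. The 3.8 ns lifetime matches the value quoted above for Rhodamine 6G; the time axis and noise level are illustrative assumptions.

```python
import numpy as np

# Simulate a single-exponential fluorescence decay and recover its
# lifetime by linear regression on the log of the trace.
tau_true = 3.8e-9                          # assumed lifetime [s]
t = np.linspace(0, 20e-9, 200)             # 0-20 ns time axis (illustrative)
rng = np.random.default_rng(1)
trace = np.exp(-t / tau_true) * (1 + 0.01 * rng.standard_normal(t.size))

slope, _ = np.polyfit(t, np.log(trace), 1)  # log-linear least-squares fit
tau_fit = -1.0 / slope                      # recovered lifetime [s]
```

In practice the instrument response and background would be deconvolved first; the log-linear fit is the simplest estimator that makes the idea concrete.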

## 4 Discussion

In this section, we compare STEAM, STAMP, SMT, FDT, FIRE, and CUP, discussed in Section 3. Table 1 shows their specifications. For comparison, the table also includes single-shot shadowgraphy (SS) with an ultrashort-pulse laser as a conventional ultrafast imaging method, which can be considered a special case of STAMP [51, 52]. Overall, SS, STEAM, SMT, and FIRE are continuous imaging methods that can acquire image frames continuously without limit, whereas STAMP, FDT, and CUP are burst imaging methods that acquire image frames at a much higher rate, but with a limited total number of frames. For this reason, SS, STEAM, SMT, and FIRE are useful for evaluating and screening a large number of objects such as industrial parts and biological cells, while STAMP, FDT, and CUP are useful for studying fast transient dynamics in depth. As Table 1 shows, since there is no free lunch, all of these imaging methods sacrifice one or more performance parameters to achieve ultrafast image acquisition.

Table 1

Comparison in terms of imaging capabilities between SS, STEAM, STAMP, SMT, FDT, FIRE, and CUP. SS, STEAM, SMT, and FIRE are continuous imaging methods while STAMP, FDT, and CUP are burst imaging methods.

The temporal resolution of these imaging methods is determined by different factors. The temporal resolution of STEAM, STAMP, and SMT is ultimately limited by the temporal duration of a single-color exposure within their simultaneous multicolor exposure. This implies that a trade-off exists between their temporal resolution and other imaging parameters: the number of pixels in a single frame in the case of STEAM, the number of frames in the case of STAMP, and the field of view in the case of SMT. Therefore, employing pulses with a broader bandwidth can relax this trade-off, allowing for higher temporal resolution. The temporal resolution of FDT is mainly determined by the maximum incidence angle of the probe beam and can hence be improved with a larger incidence angle. The temporal resolution of FIRE is in a trade-off relation with the number of focal spots. This trade-off can be relaxed by increasing the signal bandwidth; however, the available bandwidth is limited by the lifetime of the target fluorophore, which is typically a few nanoseconds. The temporal resolution of CUP is in a trade-off relation with the number of pixels and the spatial resolution in the streak direction. Its upper limit is given by the temporal resolution of the streak camera.
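
The fluorophore-limited trade-off in FIRE can be made concrete with a small calculator: the usable modulation bandwidth is roughly capped at $f_{\max} \sim 1/(2\pi\tau)$ by the fluorescence response, dividing it among $n$ focal spots sets the beat-frequency spacing, and the per-channel signal bandwidth must stay below half that spacing. The 4 ns lifetime and 100 spots are assumed example values, and the $1/(2\pi\tau)$ cap is a rough single-pole estimate rather than a figure from Ref. [9].

```python
import math

# Illustrative FIRE trade-off: more focal spots -> tighter channel
# spacing -> lower per-channel signal bandwidth (hence lower line rate).
def per_channel_bandwidth(n_spots, tau):
    """Maximum per-channel signal bandwidth [Hz] for n focal spots."""
    f_max = 1.0 / (2.0 * math.pi * tau)  # fluorophore-limited bandwidth
    spacing = f_max / n_spots            # beat-frequency spacing f_{i+1}-f_i
    return spacing / 2.0                 # half-spacing separation condition

bw = per_channel_bandwidth(n_spots=100, tau=4e-9)  # ~0.2 MHz per channel
```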

Another important factor, which is not shown in the table, is spatial resolution for microscopy. All of these imaging methods can, in principle, achieve nearly diffraction-limited spatial resolution. Additionally, STEAM and FIRE are compatible with confocal microscopy by using a single-mode fiber and a slit aperture, respectively, which doubles their spatial resolution in the lateral direction and acquires spatial frequency components in the longitudinal direction. In contrast, the spatial resolution of FDT is much lower than the diffraction limit due to its illumination at discrete angles of incidence, but this can be improved with a better design. Since CUP sacrifices pixel resolution for the sake of compressive imaging, its spatial resolution is relatively low for microscopy. Finally, it is important to note that all of these methods require a careful optical design with respect to chromatic aberration in order to achieve the highest possible spatial resolution.

The computational cost of these imaging methods also needs to be taken into account for some applications, in particular those that require real-time digital processing of the massive number of image frames generated by ultrafast image acquisition. SMT and FDT require an inverse Radon transform to reconstruct tomographic images. FIRE involves Fourier transformation for image processing. CUP incurs a relatively high computational cost as it is inherently subject to the optimization algorithms associated with compressive imaging. On the other hand, the computational cost of STEAM and STAMP is low since they directly obtain final images without much computation.

Finally, we emphasize the importance of using pulse lasers as optical sources for ultrafast imaging. In general, as mentioned in Section 2, a trade-off relation exists between shutter speed (hence frame rate) and SNR in ultrafast imaging, meaning that increasing the frame rate inevitably results in a poorer SNR. However, since ultrashort laser pulses have very high peak powers, their energy is packed into a short exposure time, which enables a high signal level and thus a reasonable SNR. This merit applies to all the imaging methods discussed above except for FIRE, which builds on a CW laser as an optical source, and CUP.

CUP is a receive-only imaging method that does not require active illumination and is compatible with ultrafast laser light exposures [53].

## 5 Summary

In summary, we have reviewed the principles and applications of the emerging ultrafast optical imaging methods STEAM, STAMP, SMT, FDT, FIRE, and CUP. We have analyzed how they circumvent the limitations of traditional image sensors to reach higher frame rates and shutter speeds. While these methods trade one or more performance parameters for a higher image acquisition speed, they are complementary to each other. Since these imaging methods have been developed only recently, we expect them to evolve further in the next decade.

## Acknowledgment

This work was supported by ImPACT Program of the Council for Science, Technology and Innovation (Cabinet Office, Government of Japan), JSPS KAKENHI (grant numbers 25702024 and 25560190), JGCS Scholarship Foundation, Mitsubishi Foundation, Konica Minolta Imaging Science Encouragement Award, Takeda Science Foundation, Noguchi Shitagau Research Grant, Epson International Scholarship Foundation, Ogino Prize, and Advanced Photon Science Alliance of the Ministry of Education, Culture, Sports, Science, and Technology (MEXT).

## References

• [1]

Peres, M.R. (Ed.), The Focal Encyclopedia of Photography: Digital Imaging, Theory and Applications, History, and Science, Fourth Edi, Abingdon, Oxon, UK, Focal Press, 2007. Google Scholar

• [2]

Ray, S.F. (Ed.), High Speed Photography and Photonics, Bellingham, WA, USA, SPIE Press, 2002. Google Scholar

• [3]

Fuller, P.W.W., An introduction to high speed photography and photonics. Imaging Sci. J. 2009, 57, 293–302. Google Scholar

• [4]

Clegg, B., The Man Who Stopped Time: The Illuminating Story of Eadweard Muybridge: Pioneer Photographer, Father of the Motion Picture, Murderer, Washington, D.C., USA, Joseph Henry Press, 2007. Google Scholar

• [5]

Goda, K., Tsia, K.K., Jalali, B., Serial time-encoded amplified imaging for real-time observation of fast dynamic phenomena. Nature 2009, 458, 1145–1149. Google Scholar

• [6]

Nakagawa, K., Iwasaki, A., Oishi, Y., et al., Sequentially timed all-optical mapping photography (STAMP). Nat. Photonics 2014, 8, 695–700. Google Scholar

• [7]

Matlis, N.H., Axley, A., Leemans, W.P., Single-shot ultrafast tomographic imaging by spectral multiplexing. Nat. Commun. 2012, 3, 1111. Google Scholar

• [8]

Li, Z., Zgadzaj, R., Wang, X., Chang, Y.-Y., Downer, M.C., Single-shot tomographic movies of evolving light-velocity objects. Nat. Commun. 2014, 5, 3085. Google Scholar

• [9]

Diebold, E.D., Buckley, B.W., Gossett, D.R., Jalali, B., Digitally synthesized beat frequency multiplexing for sub-millisecond fluorescence microscopy. Nat. Photonics 2013, 7, 806–810. Google Scholar

• [10]

Gao, L., Liang, J., Li, C., Wang, L.V., Single-shot compressed ultrafast photography at one hundred billion frames per second. Nature 2014, 516, 74–77. Google Scholar

• [11]

Goda, K., Tsia, K.K., Jalali, B., Amplified dispersive Fourier-transform imaging for ultrafast displacement sensing and barcode reading. Appl. Phys. Lett. 2008, 93, 131109. Google Scholar

• [12]

Tamamitsu, M., Nakagawa, K., Horisaki, R., et al., Design for sequentially timed all-optical mapping photography with optimum temporal performance. Opt. Lett. 2015, 40, 633–6. Google Scholar

• [13]

Davis, W.C., A High-Speed Rotating-Mirror Framing Camera. Appl. Opt. 1962, 1, 407. Google Scholar

• [14]

Waddell, J.H., Rotating prism design for continuous image compensation cameras. Appl. Opt. 1966, 5, 1211–23. Google Scholar

• [15]

Boyle, W.S., Smith, G.E., Charge Coupled Semiconductor Devices. Bell Syst. Tech. J. 1970, 49, 587–593. Google Scholar

• [16]

Joubert, J., Sharma, D., Light microscopy digital imaging. Curr. Protoc. Cytom. 2011, 58, 2.3.1–2.3.11. Google Scholar

• [17]

Solli, D.R., Ropers, C., Koonath, P., Jalali, B., Optical rogue waves. Nature 2007, 450, 1054–7. Google Scholar

• [18]

Stein, R.B., Gossen, E.R., Jones, K.E., Neuronal variability: noise or part of the signal? Nat. Rev. Neurosci. 2005, 6, 389–397. Google Scholar

• [19]

Cristofanilli, M., Budd, G.T., Ellis, M.J., et al., Circulating tumor cells, disease progression, and survival in metastatic breast cancer. N. Engl. J. Med. 2004, 351, 781–91. Google Scholar

• [20]

Zmuidzinas, J., Thermal Noise and Correlations in Photon Detection. Appl. Opt. 2003, 42, 4989. Google Scholar

• [21]

Ihee, H., Lobastov, V.A., Gomez, U.M., et al., Direct imaging of transient molecular structures with ultrafast diffraction. Science 2001, 291, 458–62. Google Scholar

• [22]

Yazaki, A., Kim, C., Chan, J., et al., Ultrafast dark-field surface inspection with hybrid-dispersion laser scanning. Appl. Phys. Lett. 2014, 104, 251106. Google Scholar

• [23]

Chen, H., Wang, C., Yazaki, A., Kim, C., Goda, K., Jalali, B., Ultrafast web inspection with hybrid dispersion laser scanner. Appl. Opt. 2013, 52, 4072–6. Google Scholar

• [24]

Goda, K., Mahjoubfar, A., Wang, C., et al., Hybrid dispersion laser scanner. Sci. Rep. 2012, 2, 445. Google Scholar

• [25]

Mahjoubfar, A., Goda, K., Ayazi, A., Fard, A., Kim, S.H., Jalali, B., High-speed nanometer-resolved imaging vibrometer and velocimeter. Appl. Phys. Lett. 2011, 98, 101107. Google Scholar

• [26]

Goda, K., Ayazi, A., Gossett, D.R., et al., High-throughput single-microparticle imaging flow analyzer. Proc. Natl. Acad. Sci. 2012, 109, 11630–11635. Google Scholar

• [27]

Goda, K., Jalali, B., Dispersive Fourier transformation for fast continuous single-shot measurements. Nat. Photonics 2013, 7, 102–112. Google Scholar

• [28]

Li, Z., Zgadzaj, R., Wang, X., Reed, S., Dong, P., Downer, M.C., Frequency-domain streak camera for ultrafast imaging of evolving light-velocity objects. Opt. Lett. 2010, 35, 4087–4089. Google Scholar

• [29]

Tokunaga, E., Terasaki, A., Kobayashi, T., Frequency-domain interferometer for femtosecond time-resolved phase spectroscopy. Opt. Lett. 1992, 17, 1131–1133. Google Scholar

• [30]

Tsien, R.Y., Fluorescent indicators of ion concentrations. Methods Cell Biol. 1989, 30, 127–56. Google Scholar

• [31]

Petty, H.R., Spatiotemporal chemical dynamics in living cells: from information trafficking to cell physiology. Biosystems. 2006, 83, 217–24. Google Scholar

• [32]

Grinvald, A., Anglister, L., Freeman, J.A., Hildesheim, R., Manker, A., Real-time optical imaging of naturally evoked electrical activity in intact frog brain. Nature 1984, 308, 848–850. Google Scholar

• [33]

Cheng, A., Goncalves, J.T., Golshani, P., Arisaka, K., Portera-Cailliau, C., Simultaneous two-photon calcium imaging at different depths with spatiotemporal multiplexing. Nat. Methods 2011, 8, 139–42.Google Scholar

• [34]

Yuste, R., Denk, W., Dendritic spines as basic functional units of neuronal integration. Nature 1995, 375, 682–4. Google Scholar

• [35]

Shiferaw, Y., Aistrup, G.L., Wasserstrom, J.A., Intracellular Ca2+ waves, afterdepolarizations, and triggered arrhythmias. Cardiovasc. Res. 2012, 95, 265–8. Google Scholar

• [36]

Howard, S.S., Straub, A., Horton, N., Kobat, D., Xu, C., Frequency Multiplexed In Vivo Multiphoton Phosphorescence Lifetime Microscopy. Nat. Photonics 2013, 7, 33–37. Google Scholar

• [37]

Wu, F., Zhang, X., Cheung, J.Y., et al., Frequency division multiplexed multichannel high-speed fluorescence confocal microscope. Biophys. J. 2006, 91, 2290–6.Google Scholar

• [38]

Sanders, J.S., Imaging with frequency-modulated reticles. Opt. Eng. 1991, 30, 1720.Google Scholar

• [39]

Futia, G., Schlup, P., Winters, D.G., Bartels, R.A., Spatially-chirped modulation imaging of absorption and fluorescent objects on single-element optical detector. Opt. Express 2011, 19, 1626–40. Google Scholar

• [40]

Sheppard, C.J.R., Mao, X.Q., Confocal Microscopes with Slit Apertures. J. Mod. Opt. 1988, 35, 1169–1185. Google Scholar

• [41]

Hamamatsu Photonics K. K., Guide to Streak Cameras. 2008. (Accessed November 8, 2015, at http://www.hamamatsu.com/resources/pdf/sys/SHSS0006E_STREAK.pdf)

• [42]

Rietdorf, J. (Ed.), Microscopy Techniques, Springer Berlin Heidelberg, Berlin, Heidelberg 2005. Google Scholar

• [43]

Borst, J.W., Visser, A.J.W.G., Fluorescence lifetime imaging microscopy in life sciences. Meas. Sci. Technol. 2010, 21, 102002. Google Scholar

• [44]

Velten, A., Lawson, E., Bardagjy, A., Bawendi, M., Raskar, R., Slow art with a trillion frames per second camera, in: ACM SIGGRAPH 2011 Talks on - SIGGRAPH ‘11, ACM Press, New York, New York, USA 2011, p. 1. Google Scholar

• [45]

Velten, A., Willwacher, T., Gupta, O., Veeraraghavan, A., Bawendi, M.G., Raskar, R., Recovering three-dimensional shape around a corner using ultrafast time-of-flight imaging. Nat. Commun. 2012, 3, 745. Google Scholar

• [46]

Stolow, A., Bragg, A.E., Neumark, D.M., Femtosecond time-resolved photoelectron spectroscopy. Chem. Rev. 2004, 104, 1719–57. Google Scholar

• [47]

Anabitarte, F., Cobo, A., Lopez-Higuera, J.M., Laser-Induced Breakdown Spectroscopy: Fundamentals, Applications, and Challenges. ISRN Spectrosc. 2012, 2012, 1–12. Google Scholar

• [48]

Bioucas-Dias, J.M., Figueiredo, M.A.T., A New TwIST: Two-Step Iterative Shrinkage/Thresholding Algorithms for Image Restoration. IEEE Trans. Image Process. 2007, 16, 2992–3004. Google Scholar

• [49]

Selanger, K.A., Falnes, J., Sikkeland, T., Fluorescence lifetime studies of Rhodamine 6G in methanol. J. Phys. Chem. 1977, 81, 1960–1963. Google Scholar

• [50]

Zuba-Surma, E.K., Kucia, M., Abdel-Latif, A., Lillard, J.W., Rata-jczak, M.Z., The ImageStream System: a key step to a new era in imaging. Folia Histochem. Cytobiol. 2007, 45, 279–290. Google Scholar

• [51]

Mao, X., Mao, S.S., Russo, R.E., Imaging femtosecond laser-induced electronic excitation in glass. Appl. Phys. Lett. 2003, 82, 697. Google Scholar

• [52]

Sun, Q., Jiang, H., Liu, Y., Wu, Z., Yang, H., Gong, Q., Measurement of the collision time of dense electronic plasma induced by a femtosecond laser in fused silica. Opt. Lett. 2005, 30, 320–322. Google Scholar

• [53]

Liang, J., Gao, L., Hai, P., Li, C., Wang, L.V., Encrypted Three-dimensional Dynamic Imaging using Snapshot Time-of-flight Compressed Ultrafast Photography. Sci. Rep. 2015, 5, 15504. Google Scholar

Accepted: 2016-03-03

Published Online: 2016-10-20

Published in Print: 2016-09-01

Citation Information: Nanophotonics, Volume 5, Issue 4, Pages 497–509, ISSN (Online) 2192-8614, ISSN (Print) 2192-8606.
