1 A story about the nicknames of laser radar
The term ‘radar’ usually denotes a means of obtaining information about remote objects or media, mostly with microwave radiation. Although optical detection and ranging was demonstrated almost simultaneously with the first radio-wave radars, no adequate term for an optical radar existed. With the advent of the laser, the physics community, enchanted by the invention, introduced the term ‘laser radar’, implying no more than the use of a laser for radar purposes. This created a controversy in the definition, allowing the term to be interpreted as obtaining information in the microwave domain with a laser, which seemed to make no sense.
Recent successes in converting laser radiation into microwaves by photonics technologies demonstrated that remotely obtaining information in the microwave domain with a laser does make sense. This laser microwave radar was given the nickname ‘photonics radar’.
Another concern followed from the concept of ‘remote sensing’, which for a microwave radar means at least kilometers. A laser radar nicknamed ‘OCT’ (optical coherence tomography), dealing with millimeters and even micrometers of distance, also uses the radar principles; the Encyclopedia of Modern Optics defines such instruments as laser microradars. Still other numerous applications, based on combinations of different physical phenomena interacting with laser radiation, have the attributes of a laser radar.
In this review, we look at some examples of the variety of nicknamed laser-based technologies, accentuating their membership in the laser radar family.
2 Laser microwave radar
2.1 Photonics radar
The worlds of radiofrequency engineering and optoelectronics have been attracting mutual interest over the past 30 years. Photonic technology allows performing complex functions in microwave systems, such as spectrum analysis, time-delay beam forming, adaptive and programmable filtering, correlation, and waveform generation, some of them impossible in the radiofrequency domain, like the ability to multiplex several wide-bandwidth, high-frequency signals on a single strand of optical fiber. An inviting option is combining laser frequencies to obtain microwaves.
Laser radiation can be subjected to linear operations (e.g. amplification) or non-linear transformations (up- and down-conversion, frequency doubling, modulation, heterodyning, etc.). A non-linear transformation changes the spectral composition of the initial laser radiation, including shifting it to a higher or lower position in the spectrum, even to the microwaves. Figure 1 demonstrates a dual-frequency laser transmitter where a microwave signal is generated via optical heterodyning of two laser frequencies f1 and f2 with an offset fc=|f1–f2|, where fc can be a microwave frequency (including millimeter waves). Additional information can be added by modulation. The photodetector transforms the frequency difference |f1–f2| into a radio frequency fc serving as the carrier of that additional information.
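The beat-note principle can be sketched numerically. The following is a minimal simulation of two tones combined on a square-law photodetector; all frequencies are scaled down to arbitrary simulation units rather than real optical/microwave values:

```python
import numpy as np

# Two "laser" tones f1, f2 combined on a square-law photodetector:
# the detector responds to intensity |E|^2, which contains a beat
# note at fc = |f1 - f2| (frequencies scaled down for simulation).
fs = 1e6                                   # sample rate
t = np.arange(10_000) / fs                 # 10 ms record
f1, f2 = 2.0e5, 2.1e5                      # the two optical tones
field = np.exp(2j * np.pi * f1 * t) + np.exp(2j * np.pi * f2 * t)
photocurrent = np.abs(field) ** 2          # square-law detection
spec = np.abs(np.fft.rfft(photocurrent - photocurrent.mean()))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
fc = freqs[np.argmax(spec)]                # recovered carrier, = |f1 - f2|
```

The optical carriers themselves never appear in the photocurrent spectrum; only their difference survives square-law detection, which is exactly why the beat frequency can be placed in the microwave range by choosing the laser offset.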
Technically, this general concept can be implemented in many different ways. The principal point is to keep the microwave frequency offset fc between the two laser signals constant, or to have it controlled in accordance with the signal-processing strategy. The simplest way to obtain a stable frequency offset is to make two optical signals from two different oscillation modes of a single dual-mode laser interfere.
Another way is optical frequency shifting. One version consists of splitting a master laser signal and shifting the frequency of one of the parts using a magneto-optic Bragg cell. Single sideband (SSB) modulation of a master laser signal was also proposed, with the central and the SSB components used to generate their difference. As a version of the SSB technique, a suppressed-carrier double sideband (SC-DSB) modulation of a master laser signal was described, with its further modification.
Injection locking of one or two slave lasers by a master laser, as well as phase locking of a slave laser to a master laser, are other ways to generate two optical signals whose interference results in the microwave frequency offset fc between them. Usually, both laser signals are transmitted to an optics-to-electronics (O/E) converter through a fiber link. The O/E converter can be a photodiode. The resulting beat signal provides a microwave carrier with a frequency fc.
On its way to the antenna, the carrier fc can be enriched with any kind of information according to the goals of the radar. This procedure is much more advantageous when done before the difference |f1–f2| is converted into fc, i.e. in the optical part of the system, for example by modulating one or both of the two laser signals as shown in Figure 1. Photonics as the modality for transferring the signal from optics to microwaves pinpointed the nickname ‘photonics radar’. For more details on the mentioned transforms, the reader can access the tutorial.
A prototype of the photonics-based radar transceiver for demonstration tests was implemented in the European PHODIR project, with electronic components added at the RF front-end: an RF circulator, switches, amplifiers, filters, and a monostatic antenna.
An RF signal is generated by selecting pairs of modes from the frequency spectrum f0±ΣFi of the mode-locked laser, to which an intermediate frequency Fm is added; the frequency (f0+Fi+Fm), filtered out at the output of the photodiode, serves as the signal carrier, chosen to be 9.9 GHz. Shifting the baseband modulating signal to an intermediate frequency Fm resolves the conflict between the laser pulse repetition rate and the optical sampling rate employed in the receiving unit (Figure 2).
The other parameters were as follows: maximum instantaneous bandwidth 40 MHz, pulse duration 1 μs, pulse repetition rate 10 kHz, and coherent integration time 20 ms (i.e. about 200 integrated pulses) for a Doppler resolution of 0.55 m/s over an unambiguous velocity of 76.4 m/s. The transmitted peak power was set to 20 W.
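The Doppler figures quoted above follow from the standard pulse-Doppler relations; a small sketch below (the helper names are ours, and the exact PHODIR definitions may include a processing or window factor not spelled out in the text):

```python
# Standard coherent pulse-Doppler relations (hypothetical helpers; the
# published figures may use slightly different processing definitions).
C = 3e8  # speed of light, m/s

def doppler_resolution(fc_hz, t_int_s):
    """Velocity resolution of coherent integration: dv = lambda / (2 * T_int)."""
    return (C / fc_hz) / (2 * t_int_s)

def unambiguous_velocity(fc_hz, prf_hz):
    """Maximum unambiguous radial speed: v_ua = lambda * PRF / 4."""
    return (C / fc_hz) * prf_hz / 4

dv = doppler_resolution(9.9e9, 20e-3)      # velocity resolution, m/s
v_ua = unambiguous_velocity(9.9e9, 10e3)   # unambiguous velocity, m/s
```

With the quoted 9.9 GHz carrier these textbook relations give roughly 0.76 m/s and 75.8 m/s, close to but not exactly the published 0.55 m/s and 76.4 m/s, so the project evidently used somewhat different processing conventions.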
One cycle of tests was the detection of civilian airplanes during take-off/landing maneuvers. Another trial was run in a maritime scenario aimed at long-range detection of cargo ships and ferries offshore, as well as short-range detection of maneuvering targets.
The maritime field trial was run in cooperation with a commercial coherent X-band radar, the Sea Eagle, the top product of GEM Elettronica. The analysis of the tests showed very similar fundamental features: the minimum detectable signal was −90 dBm for the Sea Eagle and −87 dBm for the PHODIR, while the noise figures were 5 dB and 8 dB, respectively, demonstrating good prospects for photonic radar if purpose-designed photonics components are used.
Getting a high-resolution image in microwaves requires wide-band signals. In the HUSIR project, the W band is exploited with a signal bandwidth of about 10% of the carrier. To keep an acceptable time resource for target search, the system was designed in combination with an X-band radar. To demonstrate a similar potential with laser/photonics-based radars, experimental studies with a dual-band system were also undertaken.
High resolution of an imaging laser/photonics radar in the Ka band was achieved with a 10 GHz bandwidth. In the transmitter channel, a continuous ultra-wideband linear-frequency-modulation wave is generated based on an optical frequency sextupling technique (Figure 3).
The light signal generated by a laser source is modulated by a Mach-Zehnder modulator MZM1, which is biased at the null point to suppress the optical carrier. An electric linear-frequency-modulation wave at the intermediate frequency, generated by a direct digital synthesizer (DDS), is applied to the radio-frequency port of MZM1 to modulate the intensity of the light wave. The output signal of MZM1, containing only odd-order sidebands, can be written as

E1(t) ∝ Σ(n odd) Jn(β) exp{j[ωc·t + n(ω0·t + πK0t²)]},  0 ≤ t ≤ T,

where ωc is the optical carrier frequency, β is the modulation index, Jn is the nth-order Bessel function of the first kind, and ω0, K0, and T are the center frequency, chirp rate, and temporal period of the IF signal, respectively.
After photoelectric conversion by photodetector 1, not only the center frequency but also the bandwidth is six times that of the original signal from the DDS, while the temporal period remains unchanged. The high-frequency ultra-wideband radar signal is amplified by a power amplifier and sent to the targets through a transmitting antenna.
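The sextupling mechanism can be checked with a toy simulation (frequencies scaled down arbitrarily, and a single sinusoidal drive standing in for the chirp): a null-biased MZM generates only odd-order sidebands, and with the drive index set near the first zero of J1 the ±3rd-order pair dominates, so square-law photodetection yields a beat at six times the drive frequency:

```python
import numpy as np

# Toy model of optical frequency sextupling (scaled-down frequencies).
# A null-biased MZM gives a field envelope ~ sin(beta * cos(w_m t)),
# i.e. odd-order sidebands only.  At beta ~ 3.832 (first zero of J1)
# the ±3rd-order pair dominates, so the photodetected intensity is
# dominated by the tone at 6 * f_m.
fs, f_m = 1e6, 1e4
beta = 3.832                               # drive index, J1(beta) ~ 0
t = np.arange(10_000) / fs                 # 10 ms record
field = np.sin(beta * np.cos(2 * np.pi * f_m * t))  # MZM output envelope
photocurrent = field ** 2                  # square-law photodetector
spec = np.abs(np.fft.rfft(photocurrent - photocurrent.mean()))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
f_peak = freqs[np.argmax(spec)]            # strongest beat, = 6 * f_m
```

The same reasoning applies when the drive is a chirp: every spectral component of the IF waveform is multiplied by six, which is why both the center frequency and the bandwidth grow sixfold while the period T stays the same.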
The incoming echoes are collected by a receiving antenna and amplified by a radio-frequency amplifier, composed of a low-noise amplifier and a power amplifier, before being applied to MZM2, which is fed from the output of MZM1 via an optical time delay (OTD) fiber. In the receiver, a combination of an optical frequency mixer with fiber delay lines and an electric analog-to-digital converter is capable of receiving target echoes and imaging targets at different distances.
Several types of targets were imaged, including an unmanned aerial vehicle (UAV), an airplane, and a civil building. Figure 4 shows an example of a wing image of a Boeing 737. The range resolution was calibrated to be 1.68 cm.
2.2 Low-coherence laser micro-radars
2.2.1 Optical coherence tomography
The term ‘optical coherence tomography (OCT)’ is a controversial one. First of all, the word ‘tomography’ means the technique of reconstructing a cross-section of an object from its projections by the use of Radon transforms; in the case of OCT, neither different-aspect projections nor Radon transforms are used. In contrast to X-ray tomography, which had to wait about half a century for the emergence of computers, OCT could have been invented much earlier had lasers existed.
Secondly, ‘coherence’ in this case would better read as the lack of coherence: the lower the coherence, the higher the resolution of the instrument. The first achievement of low-coherence interferometric reflectometry was obtaining signals from the boundaries of the layers of a medium, the eye being one such object. Once the signals were received, the next step was to move the laser beam in lateral directions and build a 3D distribution of the reflectance from the layers. Getting a 3D image instigated the use of the term ‘tomogram’.
There are two types of low-coherence interferometers: time-domain (TD) and frequency-domain (FD). TD interferometers use low-coherence light from a super-luminescent diode. The light is fed into a fiber-optic coupler that splits the beam into two arms (paths), one directed at the sample surface, the other at a scanning reference mirror. The detector then captures the interference of the light signals reflected back from these two arms. Constructive interference is observed as an intensity maximum when the optical paths of both arms are exactly equal. By scanning the length of the reference arm, the precise positions of the reflection points/zones in the sample can be determined.
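The axial selectivity of this scheme comes from the short coherence length of the source. A minimal numeric illustration with a Gaussian-spectrum source (the wavelength and bandwidth below are typical but assumed values):

```python
import numpy as np

# TD-OCT principle: with a Gaussian-spectrum source of center wavelength
# lam0 and FWHM bandwidth dlam, fringes appear only where the arm-length
# mismatch z is within the coherence length
#   l_c = (2 ln2 / pi) * lam0^2 / dlam.
lam0, dlam = 840e-9, 50e-9                 # assumed SLD parameters (m)
l_c = (2 * np.log(2) / np.pi) * lam0 ** 2 / dlam
z = np.linspace(-30e-6, 30e-6, 4001)       # reference-arm mismatch (m)
fringes = np.exp(-(z / l_c) ** 2) * np.cos(4 * np.pi * z / lam0)
z_peak = z[np.argmax(fringes)]             # maximum at equal path lengths
```

For these numbers the coherence length is about 6 μm, which is why a broadband (low-coherence) source, not a long-coherence laser, sets the micrometer-scale axial resolution of the instrument.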
Frequency-domain interferometers with a swept source (SS) use light from a fast-sweeping laser source instead of a super-luminescent diode. The reference mirror is fixed. The detector captures the spectrum of the interference pattern in time and then converts this spectrum to the depth domain using the Fourier transform.
The first biological application of low-coherence interferometric reflectometry, for measurement of the axial eye length, was reported by Fercher et al. in 1988. Different versions of low-coherence interferometry were developed for noninvasive measurements in biological tissues. In the eye, there are several consecutive positions of the reference mirror along the z axis where the interference gives a maximal signal (Figure 5, left). These positions correspond to the cornea, lens, and retina. Shifting the laser beam laterally in the (x, y) directions yields the profiles of these surfaces. Figure 5, right, shows an example of the structure of the human retina reconstructed from the measured intensity of light scattered from the eye fundus, the beam being shifted laterally scan-by-scan.
In 1991, Huang et al. baptized this interferometric laser radar technique as optical coherence tomography (OCT), omitting the word ‘low’ (or ‘short’) used in the name of interferometric reflectometry. An OCT imaging system based on a fiber-optic Michelson interferometer is shown in Figure 6. The advantage of the fiber-optic technology is that the setup is robust and easy to align. The sample/probe arm may be interfaced to a variety of control and imaging devices. TD-OCT found its application in ophthalmology for diagnosing ocular diseases such as glaucoma, age-related macular degeneration, and diabetic retinopathy.
Later on, the spectral/Fourier-domain OCT (SD-OCT) and the swept-source/Fourier-domain OCT (SS-OCT), sometimes called optical frequency-domain imaging (OFDI), were developed. SD-OCT has a powerful sensitivity advantage over TD detection, since it measures all of the echoes of light simultaneously. This discovery drove a boom in OCT research and development. The sensitivity is enhanced by 50–100 times, enabling a corresponding increase in imaging speeds.
SS-OCT uses an interferometer with a narrow-bandwidth frequency-swept light source and detectors that measure the interference output as a function of time. It has the advantage of not requiring a spectrometer and a line-scan camera. Therefore, it can operate at longer wavelengths, where camera technology is less developed, and it can achieve imaging speeds much faster than those of spectral/Fourier-domain OCT, which is limited by the camera speed. The primary challenge in swept-source/Fourier-domain OCT is that it requires a high-speed, narrow-linewidth swept light source.
A schematic of a typical spectral-domain OCT instrument is presented in Figure 7. The reference arm has a fixed delay and is not scanned. Interference is detected with a spectrometer and a high-speed line-scan camera. A computer reads the spectrum, rescales it from wavelength to frequency, and Fourier-transforms it to generate axial scans.
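The Fourier-domain reconstruction step can be illustrated with a toy A-scan: two reflectors at assumed depths z1 and z2 modulate a spectrum sampled uniformly in wavenumber k with fringes cos(2kz), and an FFT over k recovers the depths (all values below are illustrative, not from any particular instrument):

```python
import numpy as np

# Toy Fourier-domain OCT A-scan: two reflectors at depths z1, z2 produce
# spectral fringes cos(2*k*z); a Fourier transform over wavenumber k
# turns the fringes into peaks at the corresponding depths.
z1, z2 = 100e-6, 250e-6                    # reflector depths (m)
k = np.linspace(7.0e6, 8.0e6, 2048)        # wavenumber samples (rad/m)
spectrum = 1 + 0.5 * np.cos(2 * k * z1) + 0.3 * np.cos(2 * k * z2)
ascan = np.abs(np.fft.rfft(spectrum - spectrum.mean()))
dk = k[1] - k[0]
depth = np.fft.rfftfreq(k.size) * np.pi / dk   # fringe frequency -> depth
i1 = np.argmax(ascan)                      # strongest reflector
masked = ascan.copy()
masked[max(i1 - 5, 0):i1 + 6] = 0          # suppress the first peak
i2 = np.argmax(masked)                     # second reflector
found = sorted([depth[i1], depth[i2]])     # recovered depths, ~[z1, z2]
```

Note that the depth resolution here is set by the total swept wavenumber range (π/Δk), the Fourier-domain counterpart of the coherence-length argument in TD-OCT.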
A schematic of a typical swept-source OCT instrument is shown in Figure 8. It contains a frequency-swept light source. The reference arm has a fixed delay and is not scanned. The example shows a sample arm with a catheter/endoscope interface, and the system is assumed to operate at 1.3 μm wavelengths. This geometry enables dual balanced detection, cancelling excess noise in the laser. A portion of the frequency-swept light is directed into a Mach-Zehnder interferometer which acts as a periodic frequency filter. This interferometer configuration is more efficient than the classic Michelson interferometer because all of the light is detected. It avoids the limitations in spectral resolution and pixel size inherent to spectrometers and line-scan cameras.
The OCT technique has found extensive medical applications: in neurology, pulmonology, cardiology, angiography, and urology, as well as applications in other fields such as industry and the arts. OCT can be used both as an independent instrument and in combination with other techniques. An example is the combination of a standard cystoscope with a miniature daughter OCT endoscope. The cystoscope enables localization of volumetric OCT data in a 3D reconstruction of the bladder by detecting the arbitrary pose (position and orientation) of the OCT daughter endoscope and then using this information to register the OCT volume in the 3D bladder reconstruction. Examples of OCT imaging in cardiology and angiography are demonstrated in Figures 9 and 10, respectively.
3 Wavefront sensing laser micro-radar
In the middle of the 1970s, studies of the non-homogeneity of the atmosphere due to turbulence were intensified. They were important not only for optical radars, but also for optical communication, remote power transfer with lasers, etc. Discovery of the reciprocity of the turbulent atmosphere opened the path to correcting wavefront distortions: in the first pass, the laser beam is directed to the target and reflected from it, the wavefront being measured on its path back; for the second pass, the wavefront of the laser beam is conjugated, so that a beam of much higher energy can be delivered to the target undistorted.
In the 1990s, a similar situation arose in ophthalmology. Physicists and eye surgeons discovered that the external surface of the cornea can be reshaped with a laser, but they did not have exact data on the refraction non-homogeneity over the eye aperture. In reply to this request, two laser radar technologies of wavefront sensing were proposed: the first used a single laser beam fired into the eye and a photodetector, with a lenslet matrix in front of it, receiving the light coming back from the eye; the second was based on the ray tracing principle of consecutive (in time) probing of the eye point-by-point with a thin laser beam.
In Figure 11, the optics of the eye is represented as a thin lens with the retina in its focus. A narrow laser beam is directed into the eye along the visual axis, propagates through the eye media, scatters in the retina, and part of the scattered light propagates back to the exit. Since the light meets different refraction conditions on its different paths, it exits the eye in directions defined by the non-homogeneity of the refraction. The bundle of light from the eye falls on the matrix of lenslets (lenslet array). In the focal plane of each lenslet, a position-sensitive photodetector is installed, which measures the x–y position of the focused spot of the returned light. The easiest solution for the position measurement is a matrix of photodetectors. The name of this wavefront sensing technique contains the names of J. F. Hartmann and R. B. Shack, the latter added in honor of Shack’s laboratory, where the lenslet array was first manufactured by B. Platt.
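Numerically, each lenslet’s measurement reduces to converting a spot displacement into a local wavefront slope; a minimal sketch with purely illustrative numbers:

```python
# Shack-Hartmann principle: a spot displaced by (dx, dy) from the axis
# of a lenslet of focal length f corresponds to a local wavefront slope
# of (dx / f, dy / f).  All numbers below are illustrative assumptions.
f_lenslet = 5e-3                                    # lenslet focal length, m
spot_offsets = [(2.0e-6, -1.0e-6), (0.0, 4.0e-6)]   # measured offsets, m
slopes = [(dx / f_lenslet, dy / f_lenslet) for dx, dy in spot_offsets]
```

The first lenslet in this example reports a wavefront slope of 0.4 mrad in x, and the array of such slope pairs is the raw material for the reconstruction described next.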
For further calculations, the coordinates used are those of the center of each lenslet and of the position of the light spot focused by the corresponding lenslet. The two-dimensional distribution (a map of the refraction errors) can be reconstructed either as an approximation of the set of values of the wavefront error at each point, or as their interpolation. The approximated surface of the wavefront errors can be reconstructed as a set of 2D polynomials. The most used are the Zernike polynomials:

W(x, y) = Σm Cm Zm(x, y),

where Zm(x, y) is the mth mode of the polynomial (its mth term being a 2D function) and Cm is its coefficient (‘weight’), a real number. The polynomial description is a global presentation because each member of the set of polynomials is a component describing the whole map.
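The modal reconstruction amounts to a least-squares fit of the coefficients Cm. A toy version with a deliberately small three-mode basis (tip, tilt, defocus; the sampling and basis choice here are illustrative assumptions, and a real sensor fits the measured slopes via the gradients of the modes rather than wavefront values):

```python
import numpy as np

# Toy modal wavefront reconstruction: fit Zernike coefficients C_m by
# least squares to wavefront-error samples on the unit pupil, using a
# minimal 3-mode basis: Z1 = x (tip), Z2 = y (tilt), Z3 = 2(x^2+y^2)-1
# (defocus).
def zernike_basis(x, y):
    return np.column_stack([x, y, 2 * (x ** 2 + y ** 2) - 1])

rng = np.random.default_rng(1)
x = rng.uniform(-0.7, 0.7, 200)               # sample points in the pupil
y = rng.uniform(-0.7, 0.7, 200)
c_true = np.array([0.10, -0.05, 0.30])        # "measured" mode weights
w = zernike_basis(x, y) @ c_true              # sampled wavefront error
c_fit, *_ = np.linalg.lstsq(zernike_basis(x, y), w, rcond=None)
```

Because each mode spans the whole pupil, every fitted coefficient describes a global feature of the map, which is exactly the modal-versus-zonal distinction drawn in the text.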
Interpolation is an example of a zonal presentation. One of the options is cubic spline interpolation, which can be presented in the form of spline gradient functions written for orthogonal directions.
The ray tracing technique uses measurement of the position of a thin laser beam projected onto the retina. The beam, having passed a two-directional (x, y) acousto-optic deflector and a collimating lens, is directed into the eye parallel to the visual axis (Figure 12). The front focal point of the collimating lens is positioned at the center of scanning S of the acousto-optic deflector. Each entrance point, one at a time, provides its own projection on the retina. A position-sensing detector measures the transverse displacement (δx, δy) of the laser spot on the retina. An objective lens is used to optically conjugate the retina and the detector plane.
In the first approximation, both methods give adequate data on the refraction distribution in the eye. Still, following the paths of the probing laser light (Figure 13) shows that these paths are different for the Shack-Hartmann sensor (blue traces) and for the ray tracing one (pink traces). The first pass of the ray tracing physiologically corresponds to the image formation in the eye, as opposed to the second pass, which brings the information on the refraction in the Shack-Hartmann method. It means that the data from these different types of wavefront sensors will differ, especially for higher-order aberrations.
The structure of a microradar (developed by the Institute of Biomedical Engineering, Kiev, Ukraine) for measuring the aberrations in the eye using the ray tracing method is shown in Figure 14. The laser beam is deflected in the x and y directions by two acousto-optic deflectors, xAOD and yAOD. To get single-center scanning, the x center is transposed into the y center by means of a telescope. Since the angular range of scanning by the AODs is limited to 1°, an aperture telescope with magnification ×5 is installed at the exit of the second AOD. The collimating lens, converting the diverging bundle of rays into a parallel one, is installed with its front focus coinciding with the back focus of the second lens of the aperture telescope.
A conjugation telescope is installed in such a way as to compensate the ametropia of the eye and, at the same time, to keep each ith point of the eye aperture the same for the corresponding ith laser beam. The control of the conjugation telescope is organized so that the first set of the laser series serves for determining the defocus Zernike component. From the data on defocus, the conjugation telescope is re-aligned to compensate it. Instead of moving optical parts in the conjugation telescope, liquid lenses are very efficient, providing a switch interval between the first series of laser pulses (no conjugation) and the second series (conjugated position) of about 10 ms, with a total time of both series of about 500 ms. The coordinates of the laser projections on the retina are measured by two orthogonal linear photodetector arrays. Figure 15, top, shows an example of a wavefront map with all higher-order aberrations. Figure 15, bottom, demonstrates the results of calculation of the modulation transfer function.
Clinical experience showed the importance of correct positioning of the instrument. Changing the aspect of eye positioning and orientation relative to the optical axis of the instrument can dramatically distort the measured aberrations (Figure 16). Comparing the refraction error maps for different aspects (angles α1, α2, α3, α4, α5 between the visual axis of the eye and the optical axis of the instrument), it is evident that correcting the aberrations measured with incorrect positioning of the aberrometer would bring no satisfactory results, since the residual aberrations can be even higher than those measured with correct positioning. Objective determination of the visual axis was recently proposed. It is based on the analysis of the foveola profile and determination of its center using a two-beam configuration with beam scanning along a circular trajectory and measuring the phase relations in the received signals.
One of the recent trends in adding new functions to instruments for wavefront measurement is the analysis of the optical quality of the crystalline lens. An index of lens refraction dysfunction (dysfunctional lens index, DLI) was introduced, calculated from the modulation transfer function reconstructed from the Zernike polynomials describing the refraction inhomogeneity of the crystalline lens. To get the necessary data on the crystalline lens, the refraction of the cornea must be subtracted. An option is to measure the total and corneal refraction synchronously with the same laser radar. An additional detection channel is to be added to measure the coordinates of the spots reflected from the cornea, along with corresponding software to reconstruct the refraction map of the cornea.
Still another option makes it possible to avoid the influence of inadequate positioning when measuring the total and corneal refraction. The method is based on measuring the lenticular refraction inhomogeneity using a single entrance point for two laser beams propagating along different paths in the lens. Two laser beams 1 and 2, oriented at an angle i to each other, cross the cornea at the point C1,2 (Figure 17). For simplification, the thickness of the cornea is neglected. After crossing the cornea, the beams exit oriented at the angle r. They meet the anterior surface of the lens at the points L1a and L2a, exit from the lens through its posterior surface at the points L1p and L2p, and hit the retina at the points R1 and R2.
If one of the beams (e.g. beam 1) is oriented parallel to the visual axis C0–R0, then the distance Δx defines the total refraction of the eye at the point C1,2, and the distance δx defines the lens refraction difference between the traces L1a–L1p and L2a–L2p (under the assumption that the media in the anterior chamber and in the vitreous are homogeneous).
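The entry geometry of the second beam follows Snell’s law at the cornea; a small numeric check with assumed values (the aqueous refractive index 1.336 is a standard figure, and the angle i is arbitrary):

```python
import math

# Refraction of beam 2 at the cornea (thickness neglected, as in the
# text): with beam 1 along the visual axis (the surface normal), beam 2
# enters at angle i and refracts to angle r with sin(i) = n * sin(r).
# n = 1.336 (aqueous humor) and i = 10 deg are assumed for illustration.
n = 1.336
i = math.radians(10.0)              # angle of beam 2 outside the eye
r = math.asin(math.sin(i) / n)      # angle of beam 2 inside the eye
```

The angle between the beams thus shrinks roughly by the factor n inside the eye, which fixes the separation of the lens entry points L1a and L2a for a given anterior-chamber geometry.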
4 Discussion and conclusions
A common feature of the reviewed streams of nicknamed laser radars is their fast development. The difference is in their operational usage. OCT and wavefront measurements have become everyday clinical practice. Photonics radar is older and slower in implementation, but its prospects are no less bright. Russia is expected to complete preliminary research and development, including the building of a mockup of an ‘X-band radio-photonic radar’. It will be able to provide radio-wave imaging with greater detail to identify the target type.
The list of other nicknames of laser radars can be extended: focus tracking for CD/DVD players, laser Doppler vibrometry, vibration measurement of the vocal folds, blood flow measurement in retinal vessels, flow cytometry, glucose content measurement, nephelometry, smoke detection, oximetry, etc. In recent years, tomographic phase microscopy became a new trend, an emerging optical microscopic technique for bioimaging, especially promising in hematology. It is based on measurements of complex scattered fields to reconstruct three-dimensional refractive index maps of cells with diffraction-limited resolution by solving inverse scattering problems. A three-dimensional refractive index map of living cells can by itself indicate some pathological states that accompany human diseases. Its distribution allows visualizing morphological alterations of red blood cells caused by parasitic protozoa, thus monitoring the course of a disease and of its treatment.
All the above-mentioned nicknamed laser radars are like members of the same family with their last names changed.
‘Microwave Photonics: Components to Applications and Systems’, Eds. by A. Vilcot, B. Cabon and J. Chazelas (Kluwer Academic Publishing, Dordrecht, 2005).
U. Gliese, T. N. Nielsen, S. Norskov and K. E. Stubkjaer, IEEE Trans. Microwave Theory Tech. 46, 458–468 (1998).
D. Wake, C. R. Lima and P. A. Davies, IEEE Trans. Microwave Theory Tech. 43, 2270–2276 (1995).
C. L. Wang, Y. Pu and C. S. Tsai, J. Lightwave Technol. 10, 644–648 (1992).
M. Izutsu, S. Shikama and T. Sueta, J. Quantum Electron. QE-17, 2225–2227 (1981).
J. J. O’Reilly, P. M. Lane, R. Heidemann and R. Hofstetter, Electron Lett. 28, 2309–2311 (1992).
C. T. Lin, Y. M. Lin, J. Chen, S. P. Dai, P. T. Shih, et al. Opt. Express 16, 6056–6063 (2008).
L. Goldberg, A. M. Yurek, H. F. Taylor and J. F. Weller, Electron. Lett. 21, 814–815 (1985).
L. Goldberg, H. F. Taylor and J. F. Weller, Electron. Lett. 19, 491–493 (1983).
R. C. Steele, Electron. Lett. 19, 69–70 (1983).
J. Yao, J. Lightwave Technol. 27, 314–335 (2009).
F. Scotti, F. Laghezza, G. Serafino, S. Pinna, D. Onori, et al. J. Lightwave Technol. 32, 3365–3372 (2014).
M. G. Czerwinski and J. M. Usoff, Lincoln Lab. J. 21, 28–44 (2014).
F. Laghezza, F. Scotti, D. Onori and A. Bogoni, in ‘17th Int. Radar. Sympos. Proc., 10–12 May, 2016’ (Krakow, Poland, 2016) pp. 250–253.
A. Wang, J. Wo, X. Luo, Y. Wang, W. Cong, et al. Opt. Express 26, 20708–20717 (2018).
A. F. Fercher, K. Mengedoht and W. Werner, Opt. Lett. 13, 186–188 (1988).
D. Huang, E. A. Swanson, C. P. Lin, J. S. Schuman, W. G. Stinson, et al. Science 254, 1178–1181 (1991).
H. Wang, C. Magnain, S. Sakadzic, B. Fischl and D. A. Boas, Biomed. Opt. Express 8, 5617–5636 (2017).
F. Feroldi, J. Willemse, V. Davidoiu, M. G. O. Gräfe, D. J. van Iperen, et al. Biomed. Opt. Express 10, 3070–3091 (2019).
H. Lu, J. Lee, S. Ray, K. Tanaka, H. G. Bezerra, et al. Biomed. Opt. Express 10, 2809–2828 (2019).
B. E. Bouma, M. Villiger, K. Otsuka and W. Y. Oh, Biomed. Opt. Express 8, 2660–2686 (2017).
B. Braaf, K. A. Vermeer, K. V. Vienola and J. F. de Boer, Opt. Express 20, 20516–20534 (2012).
T. Q. Xie, M. L. Zeidel and Y. T. Pan, Opt. Express 10, 1431–1443 (2002).
Z. Zhang, U. Ikpatt, S. Lawman, B. Williams, Y. Zheng, et al. Opt. Express 27, 13951–13964 (2019).
D. C. Adler, J. Stenger, I. Gorczynska, R. Huber and F. G. Fujimoto, Opt. Express 15, 15972–15986 (2007).
S. Liang, T. Ma, J. Teng, X. Li, J. Li, et al. Opt. Lett. 39, 6652–6655 (2014).
K. L. Lurie, R. Angst, E. J. Seibel, J. C. Liao and A. K. Ellerbee Bowden, Biomed. Opt. Express 7, 4995–5009 (2016).
R. L. Fante, in ‘Proc. IEEE’ (1975) 63, pp. 1669–1692.
R. F. Lutomirski and H. T. Yura, Appl. Opt. 10, 1651–1663 (1971).
J. H. Shapiro, B. A. Capron and R. C. Harney, IEEE/OSA Conf. Laser Eng. Appl. 16D–18D (1979).
E. V. Hoversten, R. O. Harger and S. J. Halme, Proc. IEEE 58, 1626–1650 (1970).
G. A. Landis, IEEE AES Syst. Mag. 6, 3–7 (1991).
J. H. Shapiro, J. Opt. Soc. Am. 61, 491–495 (1971).
J. Liang, B. Grimm, S. Goelz and J. F. Bille, J. Opt. Soc. Am. A 11, 1949–1957 (1994).
V. V. Molebny, V. N. Kurashov, I. G. Pallikaris and L. P. Naoumidis, Proc. SPIE 2930, 147–157 (1996).
V. V. Molebny, I. G. Pallikaris, L. P. Naoumidis, V. N. Kurashov and I. H. Chyzh, Proc. SPIE 3065, 191–199 (1997).
R. Navarro and E. Moreno-Barriuso, Opt. Lett. 24, 951–953 (1999).
V. Molebny and V. Sokurenko, Proc. VII European/I World Meet Vis Physiol Opt, 25–27.08.2014, Wroclaw, Poland 222–225 (2014).
V. Molebny, Ophthalmol. Physiol. Opt. 37, 326–332 (2017).
F. Faria-Correia, I. Ramos, B. Lopes, T. Monteiro, N. Franqueira, et al. J. Refract. Surg. 32, 244–248 (2016).
F. Faria-Correia, I. Ramos, B. Lopes, T. Monteiro, N. Franqueira, et al. J. Refract. Surg. 33, 79–83 (2017).
V. Molebny, I. Pallikaris, Y. Wakil and S. Molebny, US Pat 6,409,345, June 25, 2002.
V. Sokurenko and V. Molebny, Ophthalmol. Physiol. Opt. 29, 330–337 (2009).
V. V. Molebny, Ukr Pat Appl a201809743, Aug. 15, 2018.
K. C. Pohlmann, ‘Principles of Digital Audio. 6th edition’ (McGraw-Hill, New York, 2011).
H. Tabatabai, D. E. Oliver, J. W. Rohrbaugh and C. Papadopoulos, Sens. Imag.: Int. J. 14, 13–28 (2013).
N. A. George, F. F. M. de Mul, Q. Qiu, G. R. Rakhorst and H. K. Schutte, J. Biomed. Opt. 13, 064024 (2008).
C. E. Riva, G. T. Feke, B. Eberli and V. Benary, Appl. Opt. 18, 2301–2306 (1979).
F. E. Craig, in ‘Clinical Laboratory Hematology. 2nd edition’, Ed. by S. B. McKenzie (Pearson, Essex, UK, 2014) pp. 956–974.
Y. Zhou, N. Zeng, Y. Ji, Y. Li, X. Dai, et al. J. Biomed. Opt. 16, 015004 (2011).
J. Heintzenberg and R. J. Charlson, J. Atm. Ocean Technol. 13, 987–1000 (1996).
M. Dohl, Smoke detector. US Pat 8,941,505, Jan. 27, 2015.
F. C. Delori, Appl. Opt. 27, 1113–1125 (1988).
D. Jin, R. Zhou, Z. Yaqoob and P. T. C. So, J. Opt. Soc. Am. B 34, B64–B77 (2017).