Open Access (CC BY 4.0) | Published by De Gruyter, June 29, 2020

Spatial multiplexing holographic combiner for glasses-free augmented reality

  • Jiacheng Shi, Wen Qiao, Jianyu Hua, Ruibin Li and Linsen Chen
From the journal Nanophotonics

Abstract

Glasses-free augmented reality is of great interest because it fuses virtual 3D images naturally with the physical world without the aid of any wearable equipment. Here we propose a large-scale spatial multiplexing holographic see-through combiner for full-color 3D display. The pixelated metagratings with varied orientation and spatial frequency discretely reconstruct the propagating light field. The irradiance pattern of each view is tailored to form a super Gaussian distribution with minimized crosstalk. Moreover, a spatial multiplexing holographic combiner with customized aperture sizes is adopted for the white balance of the virtually displayed full-color 3D scene. In a 32-inch prototype, 16 views form a smooth parallax with a viewing angle of 47°. A high transmission (>75%) over the entire visible spectrum is achieved. We demonstrate that the displayed virtual 3D scene not only preserves natural motion parallax, but also mixes well with natural objects. Potential applications of this study include education, communication, product design, advertisement, and head-up display.

1 Introduction

There are tremendous demands for augmented reality in education, design, clinics, vehicles, and exhibitions. Near-eye display technologies have provided various solutions to combine the virtual world with the physical world through glasses. Yet glasses-free augmented reality is more appealing because it fuses virtual 3D images naturally with the physical world without the aid of any wearable equipment. Not only do many people find it uncomfortable to wear a pair of glasses, especially for long-time observation, but glasses-free augmented reality also provides a solution for mutual discussion and communication. However, the implementation of glasses-free augmented reality is still not trivial.

Tremendous progress has been made in glasses-free 3D display [1]. Among all glasses-free 3D display technologies, autostereoscopic 3D displays feature a thin form factor and compressed data for video-rate display [2], [3], [4], [5], [6], [7], [8], [9], [10], [11], [12], [13], [14], [15], [16], [17], [18], [19], [20]. Xinzhu Sang et al. proposed a 162-inch horizontal light-field display that reconstructs 3D images within a viewing angle of 40° using an LED panel, an aspheric lens array (ALA), and a holographic functional screen (HFS) [21]. David Fattal et al. proposed a new kind of directional light guide plate based on nanograting arrays; by means of a repetitive nanograting array, the emitted light can be projected to several viewing angles [22]. Progress in metasurfaces has further facilitated the manipulation of views in 3D displays. Lingling Huang et al. proposed three-dimensional holography using subwavelength metallic nanorods with spatially varying orientations [23]. Metamaterials provide a promising approach for holography by tailoring the phase information with aperiodic nanostructures. However, a functional metamaterial device for display purposes usually needs to be larger than 2 inches, which poses a great fabrication challenge. On this basis, our group proposed a 6-inch holographic sampling 3D display with pixelated metagratings to eliminate the vergence–accommodation conflict for mobile-device applications [24], [25], [26]. In general, autostereoscopic 3D displays produce decent virtual 3D images, but the environmental background is physically blocked.

In the field of glasses-free augmented reality, Koki Wakunami et al. proposed a projection-type see-through holographic 3D display with an enlarged visual angle of 20.8° [27]. Luo Gu et al. proposed a waveguide-based head-up display with thin plate compensators; in their prototype, they projected a 2D virtual image with a field of 24° × 5° and a uniformity up to 70% [28]. Qionghua Wang et al. used a lens-array holographic optical element (LAHOE) to mix virtuality and reality over a frame size of 7 × 7 cm2 [29]. Yanfeng Su et al. proposed a dual-view 3D display using a single spatial light modulator (SLM) to reconstruct the amplitude information by the layer-based Fresnel diffraction method [30]. These prototypes all worked under narrow-bandwidth illumination. The reconstruction of full-color virtual 3D images remains challenging in glasses-free augmented reality.

Here we propose a holographic combiner based on spatial multiplexing nanostructures for glasses-free augmented reality of both virtual and physical 3D objects. The concept of reconstructing virtual 3D images by reproducing the amplitudes and phases of the light field is adopted from holography. To minimize crosstalk and preserve motion parallax, the irradiance pattern of each view is tailored from a Gaussian distribution to a super Gaussian function by geometrical transformations. Furthermore, spatial multiplexing is adopted by designing three layers of metagratings with customized sizes to adjust the white balance of the virtually displayed 3D scene. In a 32-inch prototype, 16 views form horizontal parallax with a viewing angle of 47°. Moreover, the transmission is above 75% over a spectrum ranging from 390 to 780 nm. Finally, we show that the produced virtual 3D scene not only preserves natural motion parallax, but also mixes well with natural objects.

2 Intensity distribution of each view

In conventional parallax-barrier- or lenticular-lens-based 3D displays, the views are self-repeating, so motion parallax is limited. However, glasses-free augmented reality cannot be reconstructed without motion parallax. Here we propose glasses-free augmented reality based on metagratings over a large viewing angle.

As illustrated in Figure 1(a), virtual images and natural objects are mixed together by a holographic combiner. The holographic combiner is illuminated from the back by an off-the-shelf laser projector. The function of the pixelated metagratings on the holographic combiner is to sample the viewpoints and redirect the projected hybrid images to the corresponding views (Figure 1(b)). Moreover, since the combiner is transparent, the virtual 3D scene merges with the physical objects right behind the combiner, creating an augmented reality world. The light manipulation and the layout of viewpoints are determined by the design of the metagratings in the combiner.

Figure 1: The schematic diagram of the glasses-free augmented reality 3D display. (a) Schematic diagram of the AR 3D display. (b) Schematic of the multiview 3D display based on the holographic see-through combiner with 16-view horizontal parallax (four are shown in the figure). Each pixel contains 36 metagratings.

The grating vector of each metagrating in the holographic combiner is determined by the incident and emergent light beams:

(1) $\mathbf{k}_{d} = \mathbf{k}_{i} + \mathbf{G}$

where |G| = 2π/Λ is the magnitude of the grating vector, Λ is the period of the metagrating, and k_d and k_i are the wave vectors of the emergent and incident light beams, respectively.
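To make Eq. (1) concrete, the following is a minimal Python sketch (illustrative only, not the authors' design code) that computes the local grating period and the grating-vector azimuth needed to redirect a given incident ray into a target view, taking the first diffraction order so that G is the in-plane part of k_d − k_i; the wavelength and angles used in the example are placeholders.

```python
import numpy as np

def metagrating_design(wavelength_nm, incident_dir, target_dir):
    """Sketch of Eq. (1): the in-plane grating vector is G = k_d - k_i.

    incident_dir / target_dir are 3D unit direction vectors of the incident
    and emergent rays; the combiner is assumed to lie in the x-y plane and
    only the first diffraction order is considered.
    Returns the local grating period (nm) and the grating-vector azimuth (deg).
    """
    k0 = 2 * np.pi / wavelength_nm                      # free-space wavenumber (rad/nm)
    ki = k0 * np.asarray(incident_dir, dtype=float)
    kd = k0 * np.asarray(target_dir, dtype=float)
    G = (kd - ki)[:2]                                   # tangential (in-plane) grating vector
    period = 2 * np.pi / np.linalg.norm(G)              # Lambda = 2*pi / |G|
    orientation = np.degrees(np.arctan2(G[1], G[0]))    # azimuth of G in the combiner plane
    return period, orientation

# Illustrative numbers only: 540 nm light incident 20 degrees off-normal in the
# vertical (y-z) plane, redirected to a view 10 degrees off-normal horizontally.
inc = np.array([0.0, np.sin(np.radians(20)), np.cos(np.radians(20))])
out = np.array([np.sin(np.radians(10)), 0.0, np.cos(np.radians(10))])
print(metagrating_design(540.0, inc, out))
```

Because the incidence direction changes across the combiner while each view direction is fixed, evaluating such a relation pixel by pixel yields the spatially varying periods and orientations shown in Figure 2(a and b).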

When the period and the orientation of the metagratings vary, diffraction-limited viewpoints can be formed. The shape and the arrangement of the viewpoints are tailored by the structures of the metagratings. Although full parallax can be achieved with metagratings [26], 16 views are arranged in the horizontal direction for demonstration purposes. Figure 2(a and b) shows the variation of the period and orientation of the metagratings for 16 views with horizontal parallax. In the proposed 32-inch holographic combiner, each view consists of 10,497,600 metagratings.

Figure 2: (a) The spatial frequency and (b) the orientation variation of the metagratings designed for the wavelength of 540 nm. The insets illustrate the variation of structural parameters in a voxel that contains 16 pixels for 16 views. (c) Comparison of the irradiance pattern with a traditional Gaussian distribution (dotted line) and the proposed super Gaussian distribution (solid line). The crosstalk, illustrated by the shaded region, is smaller owing to the steepened slope. (d) Comparison of crosstalk between the traditional Gaussian distribution and the proposed super Gaussian distribution. (e) Photo of the irradiance from one view. (f) The intensity distribution in the horizontal direction of the view shown in (e). (g) Photo of the irradiance from four views. (h) The intensity distribution in the horizontal direction of the four views shown in (g).

The crosstalk between views is an important parameter for evaluating the performance of a 3D display. The crosstalk along the viewing angle can be defined as follows [31]:

(2) $CT_{i}(\theta) = \dfrac{\sum_{j=1}^{N} w_{A,j}(\theta) - w_{A,i}(\theta)}{w_{A,i}(\theta)} = \dfrac{\sum_{j=1}^{N} w_{A,j}(\theta)}{w_{A,i}(\theta)} - 1$

where N is the number of views, and wA,i(θ) and wA,j(θ) are the light intensities of the ith and jth views, respectively.

Finally, the crosstalk CT can be defined as:

(3) $CT = \dfrac{\sum_{j=1}^{N} CT_{j}(\theta_{i})}{N}$

where θi is the angle at which the crosstalk of the jth view reaches its minimum value.
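Below is a minimal numerical sketch of Eqs. (2) and (3) in Python, assuming the angular intensity profiles wA,i(θ) have been sampled on a common angle grid; the Gaussian test profiles and the per-view search window are illustrative, not the measured data.

```python
import numpy as np

def crosstalk_profile(w, i):
    """Eq. (2): CT_i(theta) = sum_j w_j(theta) / w_i(theta) - 1."""
    return w.sum(axis=0) / w[i] - 1.0

def mean_crosstalk(w, theta, centers, half_width):
    """Eq. (3): average over views of CT_j taken at the angle where it is
    minimal, restricted here to each view's nominal angular window."""
    cts = [crosstalk_profile(w, j)[np.abs(theta - c) <= half_width].min()
           for j, c in enumerate(centers)]
    return float(np.mean(cts))

# Illustrative test: four Gaussian views separated by 3 degrees.
theta = np.linspace(-8.0, 8.0, 2001)
centers = np.array([-4.5, -1.5, 1.5, 4.5])
w = np.exp(-((theta[None, :] - centers[:, None]) / 1.5) ** 2)
print(f"mean crosstalk: {mean_crosstalk(w, theta, centers, 1.5):.2%}")
```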

We consider four views with angular separations of 3°. The irradiance patterns of the Gaussian distribution and the super Gaussian distribution are illustrated in Figure 2(c). The overlapped regions represent crosstalk or ghost images that could introduce visual fatigue during observation. The non-uniformity of the luminance based on Gaussian-distributed views is 15.245%, with a crosstalk of 9.05% (Figure 2(d)). In comparison, the irradiance based on the super Gaussian distribution, with its flattened top and steepened slope, exhibits a non-uniformity of 0% and a crosstalk of 0.42%.

In practice, because of the finite pixel size of each metagrating, the light intensity modulated by a single metagrating exhibits a Gaussian distribution. Mathematically, a super Gaussian function can be approximated by a series of Gaussian functions. As shown in the insets of Figure 2(a and b), we approximated the super Gaussian distribution of each view by geometrical transformations of 36 pixelated metagratings. There are therefore 576 metagratings grouped into 16 views for each voxel in the holographic combiner. To validate the concept, we fabricated a holographic combiner with four views. Figure 2(e–f) shows the photos and fitting curve of the irradiance pattern for a single view. It proves the feasibility of forming a super Gaussian function by geometrical transformations of 36 pixelated metagratings. The viewing angle of one view is 7.6° in the vertical direction and 2.93° in the horizontal direction. Figure 2(g–h) further shows the fitting curve of the irradiance pattern for four views arranged horizontally. It demonstrates good luminance uniformity with smooth motion parallax and minimized crosstalk.
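As a rough illustration of this approximation (a sketch only; the spot spacing, spot width, and super Gaussian order below are illustrative rather than the fabricated values), a flat-top profile can be built by summing the Gaussian spots of the 36 pixelated metagratings:

```python
import numpy as np

def super_gaussian(theta, width, order=6):
    """Flat-top target profile: exp(-(theta/width)^(2*order))."""
    return np.exp(-(theta / width) ** (2 * order))

def sum_of_gaussians(theta, centers, sigma):
    """Sum of the Gaussian spots produced by individual pixelated metagratings
    (one Gaussian per metagrating), normalized to unit peak."""
    profile = np.exp(-((theta[None, :] - centers[:, None]) / sigma) ** 2).sum(axis=0)
    return profile / profile.max()

theta = np.linspace(-4.0, 4.0, 1001)          # viewing angle (deg), illustrative
centers = np.linspace(-1.2, 1.2, 36)          # 36 spot centers, one per metagrating
approx = sum_of_gaussians(theta, centers, sigma=0.35)
target = super_gaussian(theta, width=1.4)
print(f"max deviation from the flat-top target: {np.abs(approx - target).max():.3f}")
```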

3 Spatial multiplexing for full-color display

Metagratings are wavelength sensitive, but each pixel under the illumination of the laser projector is simultaneously modulated by the R/G/B wavelengths. Here we propose a full-color display by spatial multiplexing. The central wavelengths of the laser projector are 609, 540 and 456 nm for red, green, and blue, respectively. One should note that the bandwidth of each color is relatively wide. The metagratings in each layer are designed with different periods (Figure 3a and b) but with the same orientation to combine the three colors into each view. The spatial frequencies of the metagratings are in the ranges of 357∼2157 line/mm, 414∼2502 line/mm and 477∼2886 line/mm, respectively. The minimum periodic variation of the metagratings is 0.645 nm. As shown in Figure 3(f), three layers of metagratings are stacked together for the modulation of the R/G/B colors.
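As a quick consistency check on these ranges, using only the numbers quoted above: for a fixed steering geometry the grating vector is set by the incident and emergent directions, so the required spatial frequency scales as 1/λ and the product of spatial frequency and wavelength should be nearly the same for all three color layers.

```python
# Spatial-frequency ranges quoted above: (wavelength in nm, f_min, f_max in line/mm)
ranges = {"R": (609, 357, 2157), "G": (540, 414, 2502), "B": (456, 477, 2886)}
for color, (wl_nm, f_min, f_max) in ranges.items():
    # f [line/mm] * wavelength [mm]; should be nearly color-independent
    print(color, round(f_min * wl_nm * 1e-6, 3), round(f_max * wl_nm * 1e-6, 3))
# -> about 0.22 at the low end and 1.31-1.35 at the high end for all three colors
```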

Figure 3: (a) The spatial frequencies of the metagratings for the central wavelengths of 609, 540 and 456 nm are in the ranges of 357∼2157 line/mm, 414∼2502 line/mm and 477∼2886 line/mm, respectively. (b) The variation of spatial frequencies of the metagratings within a voxel for the RGB colors. (c) SEM image of one subpixel with three layers of metagratings. (d) SEM image of one pixel containing 36 subpixels. (e) Microscopic image of the holographic combiner. (f) Schematic diagram of spatial multiplexing. (g, h) Comparison between the image projected on a monolayer metagrating and that on 3-layer metagratings with spatial multiplexing.

As mentioned above, the holographic combiner is covered by metagratings with varied orientations and periods. More than 960 × 540 pixels with 167,961,600 metagratings need to be patterned on the combiner over a 32-inch diagonal format, so the fabrication of the holographic combiner is highly challenging. A self-developed interference lithography system (NANOCRYSTAL200, SVG Optronics) was adopted to fabricate the device at high throughput.

Figure 3(c) shows the SEM image of a sub-pixel with a size of 61.7 × 61.7 μm. Three layers of metagratings are spatially multiplexed, and a moiré pattern is observed due to the slight differences in spatial frequency. Because the diffraction efficiency varies with wavelength, the aperture size of the metagrating in each layer is varied for white balance. In the prototype, the apertures are 625 × 4 µm2, 625 × 4 µm2, and 437.5 × 1 µm2 for red, green, and blue light, respectively.
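The aperture ratio gives a rough sense of the compensation involved. Under our simplifying assumption (not stated in the paper) that the diffracted power of a view scales linearly with metagrating area, the relative aperture areas imply that the blue channel is attenuated by roughly a factor of 5.7 relative to red and green to reach white balance:

```python
# Exposure apertures quoted for the three metagrating layers (um^2)
apertures_um2 = {"R": 625 * 4, "G": 625 * 4, "B": 437.5 * 1}
ref = apertures_um2["R"]
for color, area in apertures_um2.items():
    print(f"{color}: area = {area:.1f} um^2, relative to R = {area / ref:.3f}")
# B/R = 0.175, i.e. the blue aperture is ~5.7x smaller than the red/green apertures
```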

For the sake of high transparency, the pixels are sparsely arranged with a filling factor of less than 25%. As shown in Figure 3(d), 6 × 6 subpixels contribute 36 focal spots that tailor the super Gaussian irradiance. 4 × 4 pixels are arranged in each voxel, corresponding to 16 viewpoints. The size of each voxel is 0.37 × 0.37 mm2 (Figure 3(e)).

As illustrated in Figure 3(g), the image taken from the combiner with only one layer of metagratings turned greenish. In comparison, the projected 2D photo exhibits correct color cues and white balance when spatial multiplexing is adopted (Figure 3(h)).

4 Characteristics of glasses-free augmented reality

To validate the proposed glasses-free augmented reality based on metagratings, we built a 32-inch 3D display system by integrating a 32-inch holographic combiner with a laser projector (ZH33, Optoma), as shown in Figure 4(a). The typical parameters of the prototype are listed in Table 1. The laser projector is placed 950 mm behind the holographic combiner. The divergent light from the laser projector illuminates the combiner at incident angles spanning 20.6812° (from −4.6487° to −25.3299°) in the vertical direction and 37.12° (from −18.56° to 18.56°) in the horizontal direction. The emergent light rays from the combiner are redirected horizontally to 16 views with an angular separation of 2.93° and a viewing angle of 47°.
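For orientation, the back-of-the-envelope sketch below derives the approximate size of the viewing zone from the quoted view count, angular separation, and the 1200 mm capture distance used below; these are derived estimates, not measurements from the paper.

```python
import numpy as np

# Approximate viewing-zone geometry derived from the quoted numbers
n_views, sep_deg, capture_dist_mm = 16, 2.93, 1200

total_angle = n_views * sep_deg                  # ~46.9 deg, consistent with the ~47 deg viewing angle
zone_width = 2 * capture_dist_mm * np.tan(np.radians(total_angle / 2))
view_width = 2 * capture_dist_mm * np.tan(np.radians(sep_deg / 2))

print(f"total viewing angle ~ {total_angle:.1f} deg")
print(f"viewing-zone width at {capture_dist_mm} mm ~ {zone_width:.0f} mm")
print(f"single-view width at {capture_dist_mm} mm ~ {view_width:.0f} mm")
```

The single-view width at the capture distance comes out to roughly 60 mm, on the order of the human interpupillary distance, which is consistent with the display delivering distinct perspectives to the two eyes.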

Figure 4: (a) The prototype of a glasses-free augmented reality 3D display. (b–d) Projected 3D images of a mermaid observed from the left, middle, and right views. (e–g) Projected 3D images of a book observed from the left, middle, and right views in a dark room.

Table 1:

Typical parameters of the prototype for 32-inch glasses-free augmented reality.

Parameter | Value
Frame size | 710.4 × 399.6 mm
View number | 16
Pitch of pixels | 0.64 mm
Pitch of sub-pixels | 61.7 µm
Size of sub-pixels | 54 × 54 µm
Angular separation | 2.93°
Viewing angle | 47°
Resolution | 960 × 540
Refresh rate | 60 Hz
Central wavelength (spectral bandwidth) | R: 609 nm (41.5 nm); G: 540 nm (87.8 nm); B: 456 nm (1.9 nm)

To test the optical performance of the glasses-free augmented reality 3D display, we projected several hybrid-perspective images onto the holographic combiner, from which the perspective images are separated and redirected to each view. The projected virtual 3D scene can be observed at viewing distances from 1000 to 1670 mm in front of the holographic combiner. We captured the images with a camera (D810, Nikon) at a viewing distance of 1200 mm. The captured virtual image preserves natural motion parallax as if it were a real object (Figure 4(b–g)). In contrast, cylindrical-lens-array or parallax-barrier-based 3D displays provide limited head movement with self-repeating views.
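The exact pixel-to-view mapping of the prototype is set by the metagrating layout and is not detailed here, so the following is a purely hypothetical sketch: it assembles a hybrid-perspective frame by interleaving 16 perspective views so that each 4 × 4 block of projector pixels (one voxel) carries one pixel from every view, with the raster ordering of views inside a voxel being our own assumption.

```python
import numpy as np

def compose_hybrid(views):
    """views: (16, H, W, 3) perspective images; returns a (4H, 4W, 3) hybrid
    frame in which each 4x4 block of projector pixels (one voxel) carries one
    pixel from each of the 16 views (hypothetical view ordering)."""
    n, H, W, C = views.shape
    assert n == 16
    hybrid = np.zeros((4 * H, 4 * W, C), dtype=views.dtype)
    for v in range(16):
        r, c = divmod(v, 4)            # assumed position of view v inside a voxel
        hybrid[r::4, c::4] = views[v]
    return hybrid

# e.g. 16 views of 135 x 240 pixels -> one 540 x 960 hybrid frame for the projector
views = np.random.rand(16, 135, 240, 3).astype(np.float32)
print(compose_hybrid(views).shape)     # (540, 960, 3)
```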

As shown in Figure 5(a), the holographic combiner features high transparency, which allows the physical objects placed right behind the screen to be observed clearly without any distortion. The metagratings are patterned within the white dotted rectangular area. We measured the transmittance of the holographic combiner with a UV–visible spectrophotometer (SPECORD 210 PLUS, Analytik Jena), as shown in Figure 5(b). The transmittance is above 78% over the spectral range of 403 to 780 nm. We also measured the transmission spectra of blank glass and of glass coated with blank photoresist. The influence of the metagratings is within 10%, while the reflection of the blank glass substrate contributes about 10% of the reduction in transmittance. By applying an anti-reflection coating to the surface of the device, we can further improve the transparency of the holographic combiner. We further tested the optical performance of the glasses-free augmented reality 3D display by projecting a virtual 3D scene with a lotus and two goldfish into the viewing zone. The virtual objects fuse naturally with the real bromeliad placed behind the combiner (Visualization 1).

Figure 5: (a) Photo of the holographic see-through combiner with a bromeliad placed behind. The white dotted rectangle indicates the area covered with metagratings. (b) Transmittance curves of the holographic combiner based on pixelated metagratings. (c–e) The effect of the glasses-free augmented reality 3D display. The virtually displayed fishes and lotus fuse well with the bromeliad physically placed behind the device (see Visualization 1).

5 Discussion

The initial thought was that a projector based on three lasers would provide better performance because of the narrow bandwidth of the illumination. Projectors based on a single laser, LEDs, or mercury bulbs can also provide reasonably good displays as cost-friendly solutions.

In some applications, such as head-up displays and window displays, a compact format is critical. The distance between the laser projector and the holographic combiner in the current prototype can be further reduced by choosing a laser projector with an ultra-short focal length. Alternatively, a waveguide configuration can be used in the holographic combiner for directional illumination.

In summary, we have proposed a 32-inch glasses-free augmented reality 3D display using a laser projector and a holographic see-through combiner. We tailored the irradiance pattern of each view to form a super Gaussian distribution by geometrical transformations of 36 metagratings. The crosstalk of each view can theoretically be reduced to 0.42%. We further proposed a spatial multiplexing technique with three layers of metagratings to adjust the white balance. Sixteen views form horizontal parallax with a viewing angle of 47°. Moreover, the transmittance is above 75% over a spectrum ranging from 390 to 780 nm. Finally, we demonstrated that the produced virtual 3D scene not only preserves natural motion parallax, but also mixes well with natural objects.

With only two physical components, the proposed glasses-free augmented reality 3D display features a simple setup. Although a 32-inch prototype is demonstrated here, the projection-based solution can be readily scaled to larger display formats. The potential applications include education, communication, product design, advertisement, and head-up display.

6 Methods

Fabrication of the holographic combiner: First, we cut and pre-cleaned a glass plate. The glass substrate was then coated with positive photoresist (RJZ-390, RUIHONG Electronics Chemicals). Three layers of pixelated metagratings were successively patterned by a self-developed interference lithography system (NANOCRYSTAL200, SVG Optronics). The pixel size of each metagrating was adjusted by changing the aperture size in the interference lithography system for white balance. It took 6 days to pattern 167,961,600 metagratings over a 32-inch area. After photolithography, the nanostructures were developed in NaOH solution and blown dry. Although the lithography process is time-consuming, the surface-relief nanostructures can be effectively mass-transferred to a flexible PET film by UV nanoimprinting. The PET film covered with metagratings can be easily fixed to any transparent window or substrate for display purposes.

Measurement of irradiance distribution: The irradiance pattern of each view was measured by a CCD camera (D810, Nikon) placed 1000 mm away from the 4-view holographic combiner.

Calculation of the non-uniformity of luminance: A full-screen bright image is used for the measurement. The irradiance difference is compared along the viewing angle. The non-uniformity in luminance can be calculated by the following equation [31]:

$\Delta w = \dfrac{\max(w(\theta_{i})) - \min(w(\theta_{j}))}{\max(w(\theta_{i}))}$

where w(θi) denotes the intensity at θi, and θi and θj lie between two adjacent views.
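A minimal numerical sketch of this formula, assuming w(θ) has been sampled at angles between two adjacent views (the sample values are illustrative):

```python
import numpy as np

def luminance_nonuniformity(w):
    """Delta_w = (max(w) - min(w)) / max(w), with w sampled between
    two adjacent views."""
    w = np.asarray(w, dtype=float)
    return (w.max() - w.min()) / w.max()

# Illustrative intensity samples between two adjacent views
samples = [0.97, 1.00, 0.99, 0.95, 0.98]
print(f"non-uniformity: {luminance_nonuniformity(samples):.1%}")   # 5.0%
```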


Corresponding authors: Wen Qiao, School of Optoelectronic Science and Engineering & Collaborative Innovation Center of Suzhou Nano Science and Technology, Soochow University, Suzhou, 215006, China; Key Lab of Advanced Optical Manufacturing Technologies of Jiangsu Province & Key Lab of Modern Optical Technologies of Education Ministry of China, Soochow University, Suzhou, 215006, China; and Linsen Chen, School of Optoelectronic Science and Engineering & Collaborative Innovation Center of Suzhou Nano Science and Technology, Soochow University, Suzhou, 215006, China; Key Lab of Advanced Optical Manufacturing Technologies of Jiangsu Province & Key Lab of Modern Optical Technologies of Education Ministry of China, Soochow University, Suzhou, 215006, China; SVG Optronics, Co., Ltd, Suzhou, 215026, China

Funding source: National Natural Science Foundation of China (NSFC)

Award Identifier / Grant number: 61975140

Award Identifier / Grant number: 61575135

Funding source: Leading Technology of Jiangsu Basic Research Plan

Award Identifier / Grant number: BK20192003

Funding source: Suzhou Natural Science Foundation of China

Award Identifier / Grant number: SYG201930

Funding source: Priority Academic Program Development (PAPD) of Jiangsu Higher Education Institutions

  1. Author Contribution: W. Qiao and L. Chen conceived the research and designed the experiments. J. Shi, J. Hua, and R. Li conducted the experiment. W. Qiao and J. Shi wrote the paper. All authors discussed the results and commented on the paper.

  2. Research funding: This work was financially supported by the Natural Science Foundation of China (NSFC) (61975140 and 61575135), Leading Technology of Jiangsu Basic Research Plan (BK20192003), the Suzhou Natural Science Foundation of China (SYG201930), and the project of the Priority Academic Program Development (PAPD) of Jiangsu Higher Education Institutions.

  3. Conflict of interest statement: The authors declare the following competing financial interest: W. Qiao, L. Chen, and J. Shi are co-inventors on a related pending patent application.

References

[1] J. Geng, “Three-dimensional display technologies,” Adv. Opt. Photonics, vol. 5, pp. 456–535, 2013, https://doi.org/10.1364/aop.5.000456.

[2] G. J. Lv, B. C. Zhao, F. Wu, W. X. Zhao, Y. Z. Yang, and Q. H. Wang, “Autostereoscopic 3D display with high brightness and low crosstalk,” Appl. Opt., vol. 56, pp. 2792–2795, 2017, https://doi.org/10.1364/ao.56.002792.

[3] N. Okaichi, M. Miura, J. Arai, M. Kawakita, and T. Mishina, “Integral 3D display using multiple LCD panels and multi-image combining optical system,” Opt. Express, vol. 25, pp. 2805–2817, 2017, https://doi.org/10.1364/oe.25.002805.

[4] X. Sang, X. Gao, X. Yu, S. Xing, Y. Li, and Y. Wu, “Interactive floating full-parallax digital three-dimensional light-field display based on wavefront recomposing,” Opt. Express, vol. 26, pp. 8883–8889, 2018, https://doi.org/10.1364/oe.26.008883.

[5] L. Yang, X. Sang, X. Yu, et al., “A crosstalk-suppressed dense multi-view light-field display based on real-time light-field pickup and reconstruction,” Opt. Express, vol. 26, pp. 34412–34427, 2018, https://doi.org/10.1364/oe.26.034412.

[6] L. Ni, Z. Li, H. Li, and X. Liu, “360-degree large-scale multiprojection light-field 3D display system,” Appl. Opt., vol. 57, pp. 1817–1823, 2018, https://doi.org/10.1364/ao.57.001817.

[7] X. Xia, X. Zhang, L. Zhang, P. Surman, and Y. Zheng, “Time-multiplexed multi-view three-dimensional display with projector array and steering screen,” Opt. Express, vol. 26, pp. 15528–15538, 2018, https://doi.org/10.1364/oe.26.015528.

[8] Y. Su, Z. Cai, Q. Liu, L. Shi, F. Zhou, and J. Wu, “Binocular holographic three-dimensional display using a single spatial light modulator and a grating,” J. Opt. Soc. Am. A Opt. Image Sci. Vis., vol. 35, pp. 1477–1486, 2018, https://doi.org/10.1364/josaa.35.001477.

[9] S. Xing, X. Sang, X. Yu, et al., “High-efficient computer-generated integral imaging based on the backward ray-tracing technique and optical reconstruction,” Opt. Express, vol. 25, pp. 330–338, 2017, https://doi.org/10.1364/oe.25.000330.

[10] H. Liao, “Super long viewing distance light homogeneous emitting three-dimensional display,” Sci. Rep., vol. 5, p. 9532, 2015, https://doi.org/10.1038/srep09532.

[11] H. Fan, Y. Zhou, J. Wang, et al., “Full resolution, low crosstalk, and wide viewing angle auto-stereoscopic display with a hybrid spatial-temporal control using free-form surface backlight unit,” J. Disp. Technol., vol. 11, pp. 620–624, 2015, https://doi.org/10.1109/jdt.2015.2425432.

[12] M. H. Song, J. S. Jeong, M. U. Erdenebat, K. C. Kwon, N. Kim, and K. H. Yoo, “Integral imaging system using an adaptive lens array,” Appl. Opt., vol. 55, pp. 6399–6403, 2016, https://doi.org/10.1364/ao.55.006399.

[13] L. Luo, Q.-H. Wang, Y. Xing, H. Deng, H. Ren, and S. Li, “360-degree viewable tabletop 3D display system based on integral imaging by using perspective-oriented layer,” Opt. Commun., vol. 438, pp. 54–60, 2019, https://doi.org/10.1016/j.optcom.2019.01.013.

[14] J. H. Lee, J. Park, D. Nam, S. Y. Choi, D. S. Park, and C. Y. Kim, “Optimal projector configuration design for 300-Mpixel multi-projection 3D display,” Opt. Express, vol. 21, pp. 26820–26835, 2013, https://doi.org/10.1364/oe.21.026820.

[15] L. Ni, Z. Li, H. Li, and X. Liu, “360-degree large-scale multiprojection light-field 3D display system,” Appl. Opt., vol. 57, pp. 1817–1823, 2018, https://doi.org/10.1364/ao.57.001817.

[16] http://holografika.com/722rc/.

[17] D. E. Smalley, E. Nygaard, K. Squire, et al., “A photophoretic-trap volumetric display,” Nature, vol. 553, pp. 486–490, 2018, https://doi.org/10.1038/nature25176.

[18] R. Hirayama, D. M. Plasencia, N. Masuda, and S. Subramanian, “A volumetric display for visual, tactile and audio presentation using acoustic trapping,” Nature, vol. 575, pp. 320–323, 2019, https://doi.org/10.1038/s41586-019-1739-5.

[19] H. Ren, G. Briere, X. Fang, et al., “Metasurface orbital angular momentum holography,” Nat. Commun., vol. 10, p. 2986, 2019, https://doi.org/10.1038/s41467-019-11030-1.

[20] J. Park, K. Lee, and Y. Park, “Ultrathin wide-angle large-area digital 3D holographic display using a non-periodic photon sieve,” Nat. Commun., vol. 10, p. 1304, 2019, https://doi.org/10.1038/s41467-019-09126-9.

[21] S. Yang, X. Sang, X. Yu, et al., “162-inch 3D light field display based on aspheric lens array and holographic functional screen,” Opt. Express, vol. 26, pp. 33013–33021, 2018, https://doi.org/10.1364/oe.26.033013.

[22] D. Fattal, Z. Peng, T. Tran, et al., “A multi-directional backlight for a wide-angle, glasses-free three-dimensional display,” Nature, vol. 495, pp. 348–351, 2013, https://doi.org/10.1038/nature11972.

[23] L. Huang, X. Chen, H. Mühlenbernd, et al., “Three-dimensional optical holography using a plasmonic metasurface,” Nat. Commun., vol. 4, p. 2808, 2013, https://doi.org/10.1038/ncomms3808.

[24] W. Wan, W. Qiao, W. Huang, et al., “Efficient fabrication method of nano-grating for 3D holographic display with full parallax views,” Opt. Express, vol. 24, pp. 6203–6212, 2016, https://doi.org/10.1364/oe.24.006203.

[25] W. Wan, W. Qiao, W. Huang, et al., “Multiview holographic 3D dynamic display by combining a nano-grating patterned phase plate and LCD,” Opt. Express, vol. 25, pp. 1114–1122, 2017, https://doi.org/10.1364/oe.25.001114.

[26] W. Wan, W. Qiao, D. Pu, et al., “Holographic sampling display based on metagratings,” iScience, vol. 23, p. 100773, 2019, https://doi.org/10.1016/j.isci.2019.100773.

[27] K. Wakunami, O. Y. Hsieh, R. Oi, et al., “Projection-type see-through holographic three-dimensional display,” Nat. Commun., vol. 7, p. 12954, 2016, https://doi.org/10.1038/ncomms12954.

[28] L. Gu, D. Cheng, Q. Wang, et al., “Design of a uniform-illumination two-dimensional waveguide head-up display with thin plate compensator,” Opt. Express, vol. 27, pp. 12692–12709, 2019, https://doi.org/10.1364/oe.27.012692.

[29] H. Deng, C. Chen, M. Y. He, J. J. Li, H. L. Zhang, and Q. H. Wang, “High-resolution augmented reality 3D display with use of a lenticular lens array holographic optical element,” J. Opt. Soc. Am. A Opt. Image Sci. Vis., vol. 36, pp. 588–593, 2019, https://doi.org/10.1364/josaa.36.000588.

[30] Y. Su, Z. Cai, Q. Liu, et al., “Projection-type dual-view holographic three-dimensional display and its augmented reality applications,” Opt. Commun., vol. 428, pp. 216–226, 2018, https://doi.org/10.1016/j.optcom.2018.07.061.

[31] M. Salmimaa and T. Järvenpää, “3-D crosstalk and luminance uniformity from angular luminance profiles of multiview autostereoscopic 3-D displays,” J. Soc. Inf. Disp., vol. 16, no. 10, pp. 1033–1040, 2008, https://doi.org/10.1889/jsid16.10.1033.

Received: 2020-04-20
Accepted: 2020-06-01
Published Online: 2020-06-29

© 2020 Jiacheng Shi et al., published by De Gruyter, Berlin/Boston

This work is licensed under the Creative Commons Attribution 4.0 International License.
