CC BY 4.0 license · Open Access · Published by Oldenbourg Wissenschaftsverlag, November 27, 2021

Augmented Reality for Future Mobility: Insights from a Literature Review and HCI Workshop

  • Andreas Riegler

    Andreas Riegler is a researcher and principal investigator at the research group Mobile Interactive Systems (MINT) of the University of Applied Sciences Upper Austria. His research interests include Automotive Computing, Mobile Computing and Human-Computer Interaction. He is currently researching the use of Augmented and Virtual Reality in the context of automated driving.

  • Andreas Riener

    Andreas Riener is a professor for Human-Machine Interaction and Virtual Reality at Technische Hochschule Ingolstadt (THI), Germany with co-appointment at CARISSMA (Center of Automotive Research on Integrated Safety Systems and Measurement Area). He is further leading the Human-Computer Interaction Group (HCIG) at THI. His research interests include driving ergonomics, driver state estimation from physiological measures, human factors in driver-vehicle interfaces as well as topics related to (over)trust, acceptance, and ethical issues in automated driving.

  • Clemens Holzmann

    Clemens Holzmann is a full professor at the School of Informatics, Communications and Media of the University of Applied Sciences Upper Austria. Since 2018, he is also vice president for IT of the University of Applied Sciences Upper Austria. His research interests include Mobile Computing and Human-Computer Interaction. He is further leading the research group Mobile Interactive Systems (MINT) at University of Applied Sciences Upper Austria.

From the journal i-com

Abstract

There is a growing body of research on interaction between drivers/passengers and automated vehicles using augmented reality (AR) technology. With the advancing capabilities and availability of AR, the number of use cases in and around vehicles rises. Our literature review reveals that past AR research focussed on increasing road safety and displaying navigational aids; more recent research, however, explores support for immersive (non-)driving related activities, enhances driving and passenger experiences, and assists other road users through external human-machine interfaces (HMIs). AR may also be the enabling technology to increase trust and acceptance in automated vehicles through explainable artificial intelligence (AI), and therefore help in the shift from manual to automated driving. We organized a workshop addressing AR in automotive human-computer interaction (HCI) design, and identified a number of challenges, including human factors issues that need to be tackled, as well as opportunities and practical usages of AR in future mobility. We believe that our status-quo literature analysis and future-oriented workshop results can serve as a research agenda for user interface designers and researchers when developing automotive AR interfaces.

1 Introduction

As automated driving (AD) systems and corresponding human-machine interfaces (HMIs) advance, the usage of augmented reality (AR) technology for both in-vehicle experiences and other road users is being explored. In 2021, the Society of Automotive Engineers [35] released the latest version of their definition of vehicle automation, thereby providing the basis for future standardization and introducing a common language. While SAE Level 2 (L2) vehicles are already permitted to drive on the road, more advanced mobility solutions, such as SAE Level 3 (L3) automated vehicles (AVs), are expected to become commercially available in the near future. The intersection of automated driving and augmented reality enables usability and user experience (UX) researchers and practitioners to explore and evaluate novel automotive user interfaces. Recent literature reviews (e. g., [39], [175]) on HMI concepts covered information presentation along the reality-virtuality (RV) continuum [136], from traditional 2D screens to augmented reality and virtual reality (VR) visualizations. Application areas include automotive, aeronautics, logistics, production, education, cultural heritage, and others [39]. In AR, digital virtual images are overlaid on the real world, such that the digital content is aligned with real-world entities [9]. In VR, by contrast, environments are made up entirely of 3D digital objects [206]. Mixed reality (MR) includes all potential positions on the RV continuum, i. e., between the real world and the virtual environment. In the automotive domain specifically, AR technology has been adopted for a variety of use cases. For example, head-up displays (HUDs) and larger windshield displays (WSDs) utilize AR technology to increase drivers’ situational awareness by directing their gaze toward the road (e. g., [193]), and even go further by displaying work- or entertainment-related content for performing non-driving related tasks (NDRTs) in more highly automated vehicles in prototype studies (e. g., [169], [180], [189]).

The arrival of AR glasses such as the Microsoft HoloLens and Magic Leap One, among others, enables research on in-vehicle use of AR in a more immersive way [100], thereby allowing researchers to prototype human-machine interaction concepts more realistically using these AR devices in both lab environments and field studies. Furthermore, VR head-mounted displays (HMDs) such as the HTC Vive and Oculus Quest, combined with interaction tracking technologies like Leap Motion for hand tracking and Ultraleap Stratos for haptics, have the potential to create immersive experiences in fully simulated digital environments. These VR setups could be utilized to simulate AR content [57], [172].

Figure 1

AR papers utilized in automotive research, 2009–2020.

Figure 2

AR application areas in automotive research, 2009–2020.

In this paper, we present the results of a workshop [176] on intelligent vehicles, with focus on the practical usage of AR as well as future developments of this technology in and around intelligent vehicles, categorized into opportunities, challenges, and practical use cases.

2 Literature Review

We reviewed scientific publications since 2009, and our analysis revealed that an increasing number of them use augmented reality technology for their user studies. Previous research has shown that AR has the potential to foster trust and safety in automated vehicles, improve the user experience of driving, and provide drivers/passengers the opportunity to perform non-driving related tasks (e. g., [12], [18], [62], [98], [99], [197], [237]).

Since 2009, the number of scientific publications on augmented reality in driving research has increased strongly (see Figure 1). We searched Google Scholar, the ACM Digital Library, and IEEE Xplore for augmented reality applications used in automated vehicles, using the search terms “augmented reality”, “driving”, “automated vehicle”, and “vehicle automation”, and found more than 150 papers for further analysis.
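A screening step like the one just described can be sketched as a simple keyword-and-year filter over exported search results. This is a minimal, hypothetical sketch: the record fields (`title`, `abstract`, `year`) and the matching rule are illustrative assumptions, not the actual review pipeline.

```python
# Hypothetical screening sketch: keep papers from 2009 onward whose
# title or abstract matches one of the AR/driving search terms.
TERMS = ["augmented reality", "driving", "automated vehicle", "vehicle automation"]

def screen(papers, min_year=2009):
    """papers: iterable of dicts with 'title', 'abstract', and 'year' keys."""
    kept = []
    for p in papers:
        text = (p["title"] + " " + p.get("abstract", "")).lower()
        if p["year"] >= min_year and any(t in text for t in TERMS):
            kept.append(p)
    return kept

sample = [
    {"title": "Augmented Reality HUDs for Automated Vehicles", "year": 2019, "abstract": ""},
    {"title": "Battery Chemistry Advances", "year": 2020, "abstract": ""},
]
print([p["title"] for p in screen(sample)])
# → ['Augmented Reality HUDs for Automated Vehicles']
```

In practice, such an automatic filter would only produce candidates; the final corpus still requires manual relevance checks as performed in the review.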

Figure 2 gives an overview of selected augmented reality (AR) applications. AR is researched for infotainment and gamification concepts in automated driving, as well as displaying navigation hints and hazard indications. More recently, mobile work and well-being are researched topics in the AR domain.

Table 1

AR applications areas and corresponding literature.

Application Areas Reviewed Papers
Safety & Driver Assistance [4], [5], [21], [22], [25], [26], [29], [31], [42], [43], [50], [51], [52], [70], [71], [72], [77], [82], [83], [86], [88], [89], [90], [96], [105], [107], [108], [109], [112], [115], [116], [117], [118], [119], [122], [131], [138], [144], [154], [156], [160], [161], [163], [183], [184], [185], [194], [199], [202], [208], [215], [216], [221], [231], [232], [239], [240], [245]
Navigation & Routing [13], [17], [27], [45], [46], [81], [85], [92], [95], [102], [129], [132], [140], [150], [158], [191], [195], [196], [209], [213], [214], [229], [241], [243]
User Interface Design [16], [24], [47], [53], [54], [56], [60], [63], [74], [76], [78], [80], [91], [106], [110], [113], [114], [120], [133], [134], [135], [168], [173], [181], [200], [201], [203], [204], [205], [212], [226], [227], [242]
Interaction Modalities [19], [23], [101], [103], [104], [111], [165], [166], [170], [190], [207], [210], [217], [220], [224], [225]
Passenger Experiences [6], [37], [186], [222], [223]
External Human-Machine Interfaces [28], [33], [125], [145], [155], [211]
Trust & Acceptance [34], [59], [73], [149], [188], [233], [235], [236], [238]
Simulator & Motion Sickness [14], [84], [97], [141], [148]

We clustered the reviewed papers into a total of eight application areas, which shows the breadth of the field for automotive HCI researchers investigating AR technology. Figure 2 shows the advancement of these application areas over time, and Table 1 lists the reviewed literature with respect to the individual application areas. Initial research with AR in the automotive domain was conducted predominantly for safety and driving assistance features, with navigation cues also being researched thoroughly. Recent research has been exploring passenger experiences, vulnerable road users (VRUs), such as pedestrians and cyclists, as well as trust/overtrust and acceptance issues related to automated vehicles. Furthermore, interaction concepts inside the vehicle and outside (external human-machine interfaces, eHMIs) are being investigated.

In the following, we present each application area in more detail. Section 2.1.1 covers safety aspects, Section 2.1.2 navigational aspects, Section 2.1.3 user interface (UI) design with focus on head-up and windshield displays, Section 2.1.4 interaction modalities in combination with AR interfaces, Section 2.1.5 covers passenger experiences, and Section 2.1.6 external HMIs as well as vulnerable road users (VRUs). Furthermore, we give an overview of trust and motion/simulator sickness in Sections 2.1.7 and 2.1.8, respectively. Finally, in Section 2.2, we discuss the findings of our literature review.

2.1 Augmented Reality Application Areas

2.1.1 Safety & Driver Assistance

The Safety category mostly covers research on manual driving (e. g., [77], [86], [90]). However, with mixed traffic situations, where manual and automated vehicles share the road, soon becoming reality, research efforts in this application area will have to increase for highly automated vehicles [107]. This includes research on teleoperation of AVs [19]. Virtual reality is currently used in safety research, as AR awareness/attention cues can be simulated [177]. New opportunities will therefore open up for AR in automotive safety research, including gamification concepts [193]. Utsumi et al. [215] investigate visual information for emergency behaviors, and different weather conditions are also being explored (e. g., [239]). Additionally, vulnerable road users are included in AR safety research, for example through warnings for pedestrians [88], [89] and cyclists [161]. Furthermore, AR automotive research encompasses take-over performance and other safety aspects [137]. In particular, the transition from non-driving related content visualized on AR displays back to manual driving should be explored further. To this end, keeping the driver in the loop must be a precondition for AR interfaces in conditionally automated driving [115]. Additional research is conducted on world-fixed AR using head-up and windshield displays in VR simulations. In this case, AR displays are used to support and increase situation awareness without straining the driver’s cognitive load [105], [240].

2.1.2 Navigation & Routing

The research reviewed in this application area mostly pertains to manual driving and vehicle automation up to level 2 [129], [140]. With the progress of HMD and CAVE technology, navigation and routing functions can be explored in a more immersive environment than with monitors. Higher levels of vehicle automation are also investigated [102], [195]. Navigational aids in combination with AR HUDs or WSDs could be beneficial for reducing distractions [209], [214], and for mitigating motion sickness (covered in Section 2.1.8). Further, the advancement of AR technology can help to transition traditional, two-dimensional direction instructions towards world-fixed, three-dimensional instructions that blend into the outside environment [45], [191], [213]. A future research direction for AVs would be to combine navigation tasks with passenger experiences and gamification (see Section 2.1.5). For example, Schroeter et al. [193] use AR to playfully make the driver aware of situational warnings, a concept which could be translated to navigation and routing.

2.1.3 User Interface Design: AR Head-up and Windshield Displays

Another application area of automotive AR is the rapid UI design and prototyping process that can be realized in immersive and cost-effective environments, as Goedicke et al. [58] and Riegler et al. [174] show. For example, Morra et al. [139], Riegler et al. [177], Charissis et al. [30], and Gerber et al. [56] use VR simulation to iteratively design AR HUDs or WSDs. Merenda et al. [134], [135] focus on the granularity and visual fidelity the virtual environment must have, as driving performance and gaze behavior are impacted by these parameters, among others. Similarly, Paredes et al. [152] propose that virtual experiences should be sophisticated even early in the prototyping phase, making use of high-fidelity content. While previous research efforts on automotive UI/HMI design emphasized the visualization of AR warnings to the driver, more recent research reacts to higher levels of vehicle automation and their prospects as well as challenges. For example, Riegler et al. [181] explored which content types are relevant for potential drivers of conditionally/fully automated vehicles, where such content should be positioned on a simulated windshield display, and which transparency levels are suitable. Moreover, screen-fixed and world-fixed content presentations are being explored [173]. Especially for SAE L3 and L4 vehicles, take-over requests must be visualized properly [113]. Personalization and customization should be investigated further, as the proliferation of digital in-vehicle interfaces, and the concurrent reduction of hardware-intensive UIs (e. g., buttons, sliders), will enable novel tailored UIs to be designed for any user. The transfer from VR driving simulator studies to real-world settings should also be researched in more detail. Pettersson et al. [157] compared a VR driving simulator setup to a field study, and their findings show that while VR studies do not fully replace field studies, they can provide added value to driving simulators and desktop testing. In particular, VR can be applied to convey the overall principles behind a user study. Further research topics in this application area include the design of NDRT interfaces (e. g., virtual UI element sizes and appearances, feedback design, etc.), AR content/app switching (i. e., switching from one WSD application to another), and handover methods, such as transitioning content from a mobile phone to the windshield display, and vice versa.
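The screen-fixed vs. world-fixed distinction discussed above can be illustrated with a minimal pinhole-projection sketch: a screen-fixed label keeps constant display coordinates, while a world-fixed label must be re-projected every frame as the vehicle moves. The model below is a deliberate simplification (camera at the driver's eye, no rotation, a flat display plane standing in for the windshield) and not taken from any of the reviewed papers.

```python
# Minimal pinhole-projection sketch: a world-fixed AR label drifts on the
# display as the vehicle (camera) moves, while a screen-fixed label would
# simply keep its (u, v) coordinates.

def project(point_world, cam_pos, focal=1.0):
    """Project a 3D world point onto a display plane at distance `focal`
    in front of the camera (camera looks along +z, no rotation)."""
    x = point_world[0] - cam_pos[0]
    y = point_world[1] - cam_pos[1]
    z = point_world[2] - cam_pos[2]
    if z <= 0:
        return None  # anchor is behind the viewer, nothing to draw
    return (focal * x / z, focal * y / z)

landmark = (2.0, 0.5, 20.0)  # world-fixed anchor, e.g. a turn point 20 m ahead
far = project(landmark, cam_pos=(0.0, 0.0, 0.0))
near = project(landmark, cam_pos=(0.0, 0.0, 10.0))  # vehicle moved 10 m closer
# The projected label moves outward on the display as the anchor approaches:
print(far, near)
```

A real system would add camera rotation, latency compensation, and calibration; the point here is only that world-fixed content requires continuous re-projection against the vehicle's pose.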

2.1.4 Interaction Modalities

Future AR-enabled vehicles will need an array of modalities for interacting with their UIs. Extending more traditional buttons, knobs, sliders, and touch screens, novel user interfaces (e. g., touch, speech, gestures, handwriting, and vision) [146] are researched in the automotive context. Virtual reality (VR) driving simulation is often used, as it fosters rapid prototyping and iterating through interaction designs [174], [192]. Increasingly, due to advancements in automated driving, research on driver-vehicle interaction employing an AR interface focusses on NDRTs instead of the primary driving task. For example, Riegler et al. [170] explore gaze-based interactions with WSDs to control an AR infotainment system. Eye-tracking within VR simulations has also been utilized [124]. Riegler et al. [177] apply speech-based interactions to perform a NDRT in semi-automated vehicles. The literature review shows that many interaction modalities are being researched separately, and we encourage the employment of sensor and interaction fusion, i. e., combining multiple interaction modalities depending on the current situational context (e. g., time of day, work/leisure trip, entertainment/work activities, private/public transport). AR interfaces may also help provide feedback on the success or failure of interactions, which is already applied in smart home appliances using conversational artificial intelligence [121]. In the areas of automotive context awareness and customization employing AR technology, we believe that research is still in the early stages, and research efforts should be increased. In particular, the types of interaction suitable for the context-specific situation should be evaluated. For example, in shared public transport, large AR interface displays in combination with speech interaction might not be usable from a proxemics and privacy point of view; however, this might not be the case for private vehicles.
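Such context-dependent interaction fusion could start out as simple rules mapping situational context to a preferred modality. The sketch below is a hypothetical illustration of the idea only; the context attributes and the rules themselves are our illustrative assumptions, not an evaluated design.

```python
# Hypothetical rule-based sketch of "interaction fusion": choosing an
# input modality from the situational context, as discussed above.

def choose_modality(context):
    """context: dict with 'shared_transport' (bool), 'activity' (str),
    and 'hands_busy' (bool)."""
    if context["shared_transport"]:
        # Speech in shared spaces raises proxemics/privacy concerns,
        # so fall back to a private modality such as gaze.
        return "gaze"
    if context["hands_busy"]:
        return "speech"
    if context["activity"] == "work":
        return "touch"  # precise input for productivity tasks
    return "gesture"

print(choose_modality({"shared_transport": True, "activity": "leisure", "hands_busy": False}))
# → gaze
```

A deployed system would likely replace such hand-written rules with learned, user-adaptive models, but the rule form makes the context-to-modality mapping easy to inspect and debate.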

2.1.5 Passenger Experiences

AR technology can further be applied to explore in-vehicle passenger activities [186], [223]. Analogous to windshield displays for drivers, side windows can also serve as displays for visualizing content for side and rear-seat passengers. In particular, [222] used side windows for displaying entertainment apps and games intended to support younger passengers on longer journeys. Wang et al. [223] explored rear-seat passengers’ and commuters’ preferences for activities in AR, and found that AR HUDs for back-seat users can lead to increased road safety by decreasing distractions. Moreover, AR can guide passengers inside autonomous vehicles [152]. As human operators/drivers may not be required in future automated driving scenarios, understandable guidance of passengers, especially in public transport, needs to be investigated. Cross-vehicle entertainment might also be a viable use case for AR experiences, as Arnedo et al. [6] found. In this context, gamification approaches that combine the real outside environment with AR digital content in vehicular settings might be a novel concept for future social gaming. For highly automated driving, when drivers become passengers, this research area will be especially useful, as vehicles will be used as mobile entertainment or office platforms [190]. We therefore encourage the automotive research community to investigate comfort, anxiety, entertainment, and work-related functionalities using AR interfaces for AVs.

2.1.6 External HMIs & Vulnerable Road Users

Several studies in automotive research were conducted with emphasis on external vehicle bodies as a design space [28], [33], [145]. The majority of research utilizes AR to explore external HMI (eHMI) concepts in order to display information to other road users, such as pedestrians, passengers, and drivers of other vehicles. In mixed traffic situations, eHMIs have the ability to convey additional information, such as driving state and intent, to other road users. In a simulator study, Tong et al. [211] investigated a communication interface between an AV and pedestrians by displaying warnings. Similarly, Chang et al. [28] and Nguyen et al. [145] explored different eHMI concepts, such as public displays, warning systems, and alerts in case of vehicle malfunctions. However, such eHMIs are not limited to read-only displays; they may be utilized for interaction purposes as well, including route navigation for pedestrians using gestures [7], [33]. External HMIs should be explored not only for pedestrians, but for other road users as well, such as cyclists. The overarching goal is to foster safe interactions and establish connections between automated vehicles and vulnerable road users by creating a common language between them. Moreover, more research effort should be put into the potential of VRUs wearing AR HMDs/glasses in future scenarios. This includes presenting visual cues on pedestrians’ AR glasses to communicate intent between the various road users.

2.1.7 Trust & Acceptance

More recently, an increasing number of research studies were carried out in mixed reality environments to explore trust in automated driving systems. Ha et al. [59] examine the effect of explanation types and perceived risk on trust in AVs. They found that simple feedback (e. g., descriptions of the vehicle’s tasks) resulted in more trust in AVs, while too much feedback caused cognitive overload and did not increase trust [59]. Djavadian et al. [41] investigate the acceptance of automated vehicles, and found that particularly heavy traffic situations have the potential to increase drivers’ acceptance, as the vehicle would automatically select faster and less congested routes. Additionally, trust and acceptance should not only be determined for drivers and passengers, but also from an outsider’s standpoint. For example, Hollaender et al. [67] look into information exchange between AVs and pedestrians, and its effect on trust. Primarily, crossing scenarios, and AVs conveying safe-crossing information to pedestrians, are explored. Additionally, Colley et al. [34] investigate trust in AVs by evaluating pedestrians’ intent in order to calibrate drivers’ trust. Challenges in this application area include the need to measure trust over longer periods of time using long-term studies [94], and the possibility of decreasing and subsequently re-building trust in automated vehicles. Consequently, overtrust in AVs must be researched as well, such that drivers of conditionally automated vehicles stay situationally aware; AR interfaces could help establish alerting systems. In particular, the level of system feedback needs to be calibrated [48].

2.1.8 Simulator & Motion Sickness

When carrying out user studies on automated driving in virtual and augmented reality environments, where active drivers transition to passive passengers, simulator and motion sickness should be evaluated [84]. Simulator sickness is a form of motion sickness caused by visual discrepancies with the real world [167]. Motion sickness can arise when stationary users perceive motion, or when there are lags between head movements and the visual display presentation [64]. Symptoms of motion sickness include nausea, disorientation, headache, and general discomfort, among others [87]. McGill et al. [126], [127] explore the utilization of HMDs worn by passengers in vehicles. Further, McGill et al. [128] suggest that postural movements should be limited in order to reduce simulator sickness in case no positional tracking can be performed by the simulator. Sawabe et al. [187] use the vection illusion to cause pseudo-acceleration, which reduces the acceleration stimulus and helps passengers counteract simulator and motion sickness. A visual solution is proposed by Hanau et al. [61], who show acceleration cues to passengers, thus diminishing the sensory conflict between the perceived acceleration and the lack of corresponding visuals. Williams et al. [230] explore a multisensory setting, i. e., visuo-haptic feedback in driving simulation, and Lucas et al. [123] investigate seat vibrations for reducing simulator sickness. Their results reveal that feedback congruency and simulated road vibrations lead to greater immersion for the driver, and thus have a positive effect on simulator sickness. We see a wide range of proposed solutions to simulator/motion sickness being researched, not only from a visual AR viewpoint. It should be noted that both physiological and subjective ratings (e. g., using the Simulator Sickness Questionnaire or Presence Questionnaire) should be assessed and reported for user studies, and physiological measures [32] (e. g., heart rate variability, galvanic skin response) could further help to deduce simulator/motion sickness in driving simulations. Further, the study duration, visual fidelity, and other parameters, which have an impact on the level of immersion and presence [20], should be considered in VR simulations.
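For reporting such subjective ratings, the Simulator Sickness Questionnaire is commonly scored from 16 symptom items (each rated 0–3) grouped into three overlapping subscales, which are scaled by standard weights. The sketch below assumes the commonly published weights and leaves the item-to-subscale grouping to the caller; it should be checked against the original questionnaire before use in a study.

```python
# Sketch of Simulator Sickness Questionnaire (SSQ) scoring using the
# standard subscale weights attributed to Kennedy et al. Raw inputs are
# the sums of the 0-3 item ratings assigned to each (overlapping)
# subscale; the item-to-subscale mapping itself is assumed to follow the
# commonly published version and is not encoded here.

WEIGHTS = {"nausea": 9.54, "oculomotor": 7.58, "disorientation": 13.92}
TOTAL_WEIGHT = 3.74

def ssq_scores(raw):
    """raw: dict of raw item sums per subscale, e.g. {'nausea': 2, ...}."""
    scores = {name: raw[name] * w for name, w in WEIGHTS.items()}
    # Total score: sum of the three raw subscale sums, scaled once.
    scores["total"] = sum(raw.values()) * TOTAL_WEIGHT
    return scores

print(ssq_scores({"nausea": 2, "oculomotor": 3, "disorientation": 1}))
```

Reporting the three subscale scores alongside the total (rather than the total alone) makes it easier to compare which class of symptoms a given AR/VR setup provokes.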

2.2 Discussion of Findings

As Table 1 shows, the vast majority of research papers from 2009 to 2020 covered safety-relevant aspects using AR interfaces. More recently, safety aspects were combined with non-driving related tasks in the form of keeping the driver in the loop and situationally aware. Safety aspects are therefore further researched particularly in more highly automated driving scenarios, by directing the driver’s attention towards the outside environment using AR interfaces. AR navigation interfaces are also an established research field that continues to be explored with new concepts for world-fixed AR guidance. AR HUDs and WSDs are the most commonly researched AR interfaces in vehicles; however, as AR HMDs/glasses become lighter and their field of view increases, automotive HCI researchers should consider this medium as well. Many open questions regarding in-vehicle AR interfaces pertain to highlighting hazards, conveying uncertainty/feedback, as well as trust and acceptance. Additionally, interfaces for NDRTs are a focus of HCI research, with multiple modalities including gestures, gaze, speech, and olfactory interfaces being prototypically explored. We therefore encourage researchers to further investigate world-fixed AR for information presentation and feedback design, not only for warning cues in manual driving (as shown in Section 2.1.1), but also for non-driving related tasks in (conditionally) automated driving, and for keeping the driver in the loop to take over manual control of the vehicle. This includes visual transitions and multimodal displays (e. g., auditory, olfactory, among others) in order to increase take-over performance. Passengers have often been neglected in automotive HCI research; with AR interfaces, this might change, as side windows could turn into infotainment displays, and the windshield might become a tool for collaborative work and entertainment in automated vehicles. Therefore, novel passenger experiences with AR interfaces, not just for “fun activities”, but also for mitigating motion sickness, will require more research effort. Outside the vehicle, eHMIs are of growing interest to researchers, as one of the factors contributing to the advancement of automated vehicles will be acceptance by pedestrians and other road users. Standards or guidelines for such external interfaces need to be developed in order to avoid confusing future users (e. g., many different ways to signal safe passage might be confusing for pedestrians). Finally, calibrating trust, and dealing with distrust but also overtrust in automated vehicles, has also been explored in conjunction with AR interfaces and conveying explainable AI. Context awareness informed by the driver’s physiological metrics (e. g., stress) could be used to adapt (i. e., increase or decrease) system feedback and information display, and might lead to increasing levels of trust. However, longitudinal studies need to be carried out to better analyze trust over time, and how it can be calibrated to achieve broad acceptance among drivers/passengers.

The literature review served as the basis for the HCI workshop, as it gave us a wide range of AR application areas for mobility concepts and subsequently helped identify research gaps. Therefore, in the following Section 3, we give an overview of the conducted workshop and categorize the brainstorming results into action points.

3 HCI Workshop Results

The 2021 Mensch und Computer conference [1] was accompanied by a workshop [176] with the purpose of developing a holistic approach to using augmented reality technology for vehicle drivers/passengers and other road users. Our workshop participants were, at the time of the workshop, active in academia and industry (i. e., original equipment manufacturers or automotive suppliers).

Figure 3 shows our brainstorming and discussion results. The main focus of the workshop was to identify challenges, opportunities, and practical use cases for automotive AR technology. The workshop findings are presented in detail in the following. Section 3.1 covers AR applications for in-vehicle experiences, and Section 3.2 summarizes the workshop findings on AR usage for pedestrians, other road users, and remote operators.

Figure 3

HCI workshop brainstorming results.

3.1 Augmented Reality Applications for In-Vehicle Experiences

As our literature review shows, utilizing augmented reality technology has proven to benefit in-vehicle users such as drivers and passengers in prototype studies. In this section, we describe and discuss our workshop findings for in-vehicle AR usage, both stationary (e. g., using head-up and windshield displays) and mobile (e. g., using head-mounted displays or AR glasses).

3.1.1 Safety & Driver Assistance

The role of safety in automotive AR research was thoroughly discussed in our workshop. We found that the overarching goal of AR safety applications should be the support of human information processing. To that end, AR has the potential to minimize gaze diversion by projecting necessary information into the driver’s field of view, rather than requiring the driver to move their head to, for example, a center display [201], thereby fostering situation awareness [153]. The driver must be aware at all times in which mode (automated vs. manual) the vehicle is operating, and AR plays an important part in keeping the driver in the loop by visualizing the active mode [194]. Moreover, only relevant and necessary information should be visualized. This requires an understanding of the user’s and vehicle’s context. For example, highly automated vehicles (SAE level 3 and higher) could enable the “driver” to conduct work- and entertainment-related activities, while vehicles with a lower automation level might only allow short and small notifications to be displayed, rather than content-rich applications [181]. Highlighting pedestrians or other hazards is an additional application area of AR; it requires both correct real-world positioning of the AR content and transparency about the detected traffic participants [25], and may even need to gain the driver’s attention when depicting crash predictions or preventions [214]. Information about other traffic participants could therefore be displayed in order to support cooperation between the individual entities. For example, in case of traffic jams, the AR system could visualize an emergency corridor, guiding the driver to the correct lane. Another example we came up with is the visualization of the end of a traffic jam in order to prepare the driver/passengers for halting the vehicle. A further safety aspect concerns not what information should be displayed, but rather what medium should be used for AR. In most research papers, AR is realized or simulated as a HUD or WSD, rather than through AR HMDs. However, the viability of HMDs for drivers should be investigated nonetheless; in that case, legal and safety aspects (for field studies) should additionally be considered. Further, longitudinal studies should be conducted to explore the role of perceived safety with AR interfaces, and the possibility of behavioral adaptation of the interface, such as limiting the level of verbosity of visual information (e. g., displaying only emergency information vs. displaying all detected objects in the road environment) [11].

3.1.2 Navigation

Another prominent feature of automotive AR is the ability to provide more immersive navigation instructions, which are particularly useful for lower levels of vehicle automation (level 2 and below). World-fixed AR can visualize the proposed path for the vehicle to follow, blended seamlessly into the environment [81]. Further, road regulations and traffic signs can be highlighted, thereby guiding the driver through traffic with the most relevant information [2]. For higher levels of vehicle automation, detecting front vehicles and visualizing content that “follows” along can have the benefit of reducing simulator sickness and increasing situational awareness, as the driver’s eyes do not need to change visual focus between the real world and the digital content on the HUD/WSD. Overall, the consensus of the workshop participants was that for lower levels of vehicle automation, navigation instructions in AR can enhance the manual driving experience, while for more highly automated vehicles, motion sickness should be researched more extensively in combination with navigation visualizations.
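
World-fixed AR presupposes that 3D waypoints along the proposed path can be mapped onto the display surface. A minimal sketch of this mapping, assuming an idealized pinhole camera model with made-up focal-length and display-center parameters (real HUD optics and calibration are considerably more involved):

```python
def project(point_camera, f=1000.0, cx=640.0, cy=360.0):
    """Project a camera-space 3D point (x right, y up, z forward, in meters)
    onto 2D display coordinates using an idealized pinhole model."""
    x, y, z = point_camera
    if z <= 0:  # point is behind the display plane; nothing to draw
        return None
    return (cx + f * x / z, cy - f * y / z)

# Waypoints further ahead converge towards the display center,
# producing the familiar "carpet" path visualization:
path = [(0.0, -1.5, d) for d in (5.0, 10.0, 20.0)]
projected = [project(p) for p in path]
```

In a moving vehicle, these projections must be recomputed every frame from the vehicle’s pose, which is where the latency concerns discussed in the hardware section become relevant.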

3.1.3 Trust & Acceptance

Our workshop participants also identified trust as an important research topic, as two emerging technologies intersect in potential future mobility solutions: automated vehicles and augmented reality. New technologies need to build up their users’ trust [234]. While technologically savvy people might welcome revolutionary changes in vehicles, other drivers should not be left behind; otherwise, novel technology might be seen as a hindrance rather than a catalyst for improved mobility. AR may play a vital role in creating trust and acceptance in automated vehicles, as the vehicle’s “thinking and doing” can be visualized, for example by highlighting the obstacles in the HUD/WSD that caused certain road maneuvers. This level of transparency can be a form of explainable artificial intelligence (XAI) [235]. On the other side of the spectrum, overtrust should be addressed as well. Relying too much on novel technology could create a false sense of comfort in automated vehicles, and the driver should be made aware of the technology’s capabilities and deficiencies. In that regard, trust must be calibrated to the individual user’s attitude towards automated driving, and we believe that AR can visually show the driver what the vehicle’s sensors “see”, thereby giving the driver an idea of the vehicle’s technological scope.

3.1.4 Content Presentation & AR Visualization

One opportunity of AR interfaces is to re-define traditional content presentation, for example by using a head-mounted display or a large 3D AR windshield display. Compared to dashboard screens or center tablets, windshield displays offer a vastly larger screen space, which can be utilized by multiple users. However, for world-fixed, or contact-analog, visualization to work, the outside road environment must be detected, e. g., using optical cameras or Lidar sensors, resulting in point clouds or areas that AR apps can utilize to place digital objects. This enables users (driver, passengers) to experience novel, richer information presentations based on real-world information (points of interest, other road users, etc.). Information pertaining to “real-world” entities, such as front vehicles or pedestrians, might be easier to understand when highlighted in AR than when presented as warnings on a traditional screen. However, as the driver population does not consist solely of young, technologically savvy users, all target groups must be considered, including elderly drivers and their attitude towards new interfaces [178]. Further, users must be able to understand the information presented on AR displays, and it might be helpful to initially guide users through the AR interface with onboarding [36]. Moreover, AR-only interfaces might not be suitable for certain users, and automotive UI/UX designers might need to design for traditional UIs as well as AR interfaces. Another challenge regarding AR interfaces is the context in which information with different levels of priority needs to be visualized. For example, safety-relevant driving information and side-activity information may appear side by side on an AR HUD/WSD; yet the driver must be able to differentiate between them, or the system has to decide what granularity of information is best displayed.

When presented with large screens, we must consider the possibility of visual overload and distraction caused by visual clutter, i. e., too many digital objects placed in the scene. This abundance of information could lead to cognitive overload, making the driver feel stressed and reducing situational awareness [48]. It is therefore necessary to display information only when the driver’s awareness is needed, e. g., for take-over requests in semi-automated driving [10]. It might be advantageous to blend the virtual information into the real environment using AR interfaces, as the driver would not have to switch focus between different distances (real world, digital information), as is the case with screen-fixed displays, e. g., dashboard screens. Moreover, the digital content should not obstruct the real world by being displayed too opaquely, i. e., the transparency of the content should be adjusted to the real environment [171].
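
One way to operationalize this transparency adjustment is to derive the overlay opacity from content priority and background brightness, capped so that the road scene is never fully occluded. The following is a simple linear heuristic of our own devising, not a validated perceptual model; all constants are placeholders:

```python
def content_opacity(priority, background_luminance, max_opacity=0.85):
    """Map content priority (0..1) and normalized background luminance (0..1)
    to an overlay opacity. Brighter backgrounds need more opacity for
    legibility, but the cap keeps the real world visible behind the content."""
    base = 0.3 + 0.5 * priority          # higher-priority content is rendered more visibly
    boost = 0.2 * background_luminance   # compensate for bright scenes (e.g., low sun)
    return min(max_opacity, base + boost)
```

A real system would estimate the background luminance per display region, e. g., from a forward-facing camera, rather than using a single global value.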

3.1.5 Control System

We clustered brainstorming ideas pertaining to interaction with and control of an AR interface under the term control system. Similar to the novel finger gestures, such as pinch-to-zoom, introduced with multi-touch enabled smartphones [65], interaction paradigms for AR interfaces must be learned. Therefore, onboarding processes will be required to familiarize drivers/passengers with operating non-traditional 3D displays. Additionally, all target and age groups need to be considered. We strongly believe that the advancement of automated driving will be coupled with the ubiquity of AR interfaces, as more and more tasks might shift towards novel non-driving related experiences in vehicles, making way for work and well-being activities [190]. In that regard, metaphors from the operation of smartphones might be helpful and considered intuitive by automotive users (e. g., [15], [182]). Subsequently, we considered how interactions with AR interfaces should be performed. As vehicle automation progresses, the driver will move further away from the steering wheel, and pressing haptic buttons might not be the most efficient or pleasant way to navigate through AR content. Therefore, a multimodal approach considering finger/hand gestures, speech, and gaze input, and combinations thereof, should be explored. For world-fixed, or contact-analog, AR, eye tracking is required anyway, and would therefore be an interesting interaction modality [169]. However, physical controls should not be dismissed, as they are already well-known; if placed in easily reachable locations (e. g., integrated into the driver’s/passenger’s rotatable seat), they should be explored as well. Conversational UIs [93] can further be realized with AR interfaces, as they allow for more user-centered human input, potentially combining voice assistance and immersive visualizations [159].
Furthermore, the transition from opaque 2D screens towards semi-transparent 3D AR displays will also require further attention by researchers and UI designers, as users are not yet familiar with these interfaces.
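
As a toy illustration of such multimodal combination, a deictic voice command (“open that”) can be resolved against the current gaze target, falling back to a pointing gesture. The function and the target names are hypothetical and stand in for a real fusion engine, which would also have to align the input streams in time:

```python
def fuse_inputs(speech_command, gaze_target=None, hand_gesture_target=None):
    """Resolve a spoken command against gaze first, then a pointing gesture.
    Returns (verb, referent), or None if a deictic reference cannot be resolved."""
    referent = gaze_target or hand_gesture_target
    verb = speech_command.split()[0]
    if "that" in speech_command and referent is None:
        return None  # "open that" with nothing being looked at or pointed to
    return (verb, referent)
```

Even this trivial sketch shows why multimodality helps: each modality alone is ambiguous (speech lacks a referent, gaze lacks an intent), but their combination yields an actionable command.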

3.1.6 Personalization & Customization

One of the main opportunities of AR interfaces is the potential to modify the content more easily compared to traditional dashboard/center stack/steering wheel haptic interfaces. While large tablet displays also open the possibility to adapt the graphical user interface, AR HUDs/WSDs might have the edge, as they direct the driver’s/passengers’ attention towards the road environment, thereby potentially increasing situational awareness and mitigating motion sickness [173]. We further identified that AR display layouts should be divided into at least two groups: customizable and non-customizable layout elements. Different levels of customization should be introduced, such as binary modes (e. g., light or dark UI skins), finite modes (e. g., discrete volume steps between a minimum and a maximum value), and continuous values (e. g., colors). Personalization should therefore be possible, depending on the functionality to customize and the level of vehicle automation [78], [179]. Nevertheless, an adequate “one-size-fits-all” AR interface should also be explored and evaluated: while future mobility solutions might prioritize in-vehicle experiences, customizable user experience (UX) and user preferences do not necessarily imply better or safer interfaces. Therefore, a holistic approach to automotive AR interface design coupled with safety aspects should be investigated.
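
The three customization levels (binary, finite, continuous) could be captured in a small settings model that validates user-chosen values. The dictionary schema and the example settings below are purely illustrative:

```python
def validate(setting, value):
    """Check whether a proposed value is admissible for a customization setting."""
    kind = setting["kind"]
    if kind == "binary":
        return value in setting["options"]                         # e.g. light vs. dark skin
    if kind == "finite":
        return value in range(setting["min"], setting["max"] + 1)  # discrete steps
    if kind == "continuous":
        return setting["min"] <= value <= setting["max"]           # e.g. a color hue
    raise ValueError(f"unknown setting kind: {kind}")

# Hypothetical example settings for an AR layout:
ui_skin = {"kind": "binary", "options": {"light", "dark"}}
volume = {"kind": "finite", "min": 0, "max": 10}
hue = {"kind": "continuous", "min": 0.0, "max": 360.0}
```

A safety-aware variant would additionally mark each setting as customizable or locked depending on the current level of vehicle automation.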

3.1.7 Driver Improvements & Gamification

AR interfaces may further enable drivers to improve their driving skills. For example, gamification concepts could be applied both to help drivers increase road safety and to encourage environmentally friendly driving. Misbehavior, driving violations, etc. could be instantaneously displayed on the AR HUD/WSD for drivers to learn from their mistakes. Gamification interfaces and social features could be used to “rank” drivers and compare their driving skills with those of other drivers or friends. The use of AR interfaces in driving schools has the potential benefit of playfully instructing learners to improve their driving proficiency [3].

3.1.8 Feedback Strategies

Another main category of our workshop findings is the opportunity of AR interfaces to give visual feedback to the driver; in case the feedback is location-based, the interface can visually highlight that information. Feedback is a core component for facilitating trust [233], but also for making explainable AI visually tangible, for example by showing why a certain vehicle behavior or maneuver was performed. The level of feedback needs to be determined, as too much feedback might cognitively overwhelm the driver, while too little feedback might reduce the driver’s trust in the vehicle. An open question remains whether the vehicle should only provide feedback on what the vehicle can control, or on the driver’s performance (e. g., compliant driving, environmentally friendly driving), or both. Additionally, the level of feedback might be interesting for the driver to customize. For example, should the AR interface show what the vehicle “sees and senses”, what it “thinks”, and/or the resulting action (cf. Tesla Autopilot’s 3D mapping of individual vehicles [79])?

3.1.9 Entertainment & Work

One of the main benefits of automated driving is the ability to engage in non-driving related tasks (NDRTs), such as working or media consumption [142]. The shift from today’s information displays to tomorrow’s entertainment and experience displays leads to new opportunities and challenges for automotive UI/UX designers. AR interfaces introduce new display and interaction paradigms in a transition away from traditional 2D screens towards immersive 3D continuous-depth displays [78]. While the immediate use case for AR HUDs/WSDs opens up for drivers (cf. safety and navigation aspects), passengers could also benefit from AR interfaces implemented on the side windows (cf. display of external information) [222]. Similar to today’s mobile app marketplaces, we envision automotive app stores with developer-driven apps (in addition to OEM apps) focussing on experiences based on the different levels of vehicle automation, and placed in different app categories (e. g., games, safety, navigation, news, etc.). Analogous to mobile app HCI and design guidelines, it would be appropriate to start developing automotive app HCI and design guidelines for AR interfaces. As smartphone and tablet screens are small compared to windshield displays, a new design space [228] and corresponding guidelines are needed for automotive UI/UX designers and developers to create novel experiences for driver/passenger NDRTs.

3.1.10 Display of External Information

First, AR allows the placement of points of interest (POIs) in the driver’s or passengers’ field of view, thereby enabling them to process geospatial information blended into the environment [246]. Application areas for displaying POIs go beyond navigation [92] and include context-aware information such as restaurants and local businesses [151]. Furthermore, AR applications can display user-specific content relating to the outside road environment. For example, police officers could utilize AR windshield displays to show additional data about vehicles in the field of view, such as their speed and license information. Of course, privacy issues might arise that will eventually need to be addressed [68].

3.1.11 Hardware

One of the biggest potential advantages of in-car AR is that it enables minimalist UIs. As such, visual clutter on the dashboard, center stack, and steering wheel with partly redundant functionality could be eliminated [181]. Notable challenges with in-vehicle AR pertain to hardware aspects. For instance, vehicle-to-vehicle communication must be extended in order to reduce reliance on standalone sensors, e. g., for detecting other road users [162]. Further, 3D maps for navigation must be kept updated, in particular to facilitate world-relative navigation instructions that blend seamlessly into the environment [130]. For that reason, hardware is not only important inside the vehicle; reliable and secure backends also need to be established, e. g., by OEMs or service providers, to ensure that high-quality content is streamed to the vehicle. In the case of AR HUDs and WSDs, eye box restrictions need to be accounted for. Currently, the position/placement of HUD content needs to be manually adjusted by the driver; however, with eye tracking devices being used for driver monitoring [38], [198], an additional use case for them would be to automatically adjust the AR content according to the driver’s head position and rotation. The use of AR HUDs/WSDs as a single point of information would further advance the opportunity to eliminate or reduce physical buttons, knobs, and displays (which are sometimes redundant or rarely used) and their resource consumption, resulting in an ecological benefit. Only the needed settings/user-defined apps would be used, leading to a more minimalist user interface. When it comes to integrating digital objects with the real world in a moving environment, one must further consider latency issues: if digital objects cannot be rendered and visualized within a reasonable timeframe, the overlay will become confusing for the driver.
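
The head-tracking adjustment could, in its simplest form, clamp the measured head displacement to the eye box limits and translate the AR content accordingly. The parameter values below (eye box half-width in meters, unit gain) are made-up placeholders; a real HUD would map the offset through its specific optics:

```python
def hud_offset(head_dx, head_dy, eyebox_half=0.06, gain=1.0):
    """Translate lateral/vertical head displacement (meters from the nominal
    eye point) into a content offset, clamped to the eye box limits so the
    content never drifts outside the region where the virtual image is visible."""
    clamp = lambda v: max(-eyebox_half, min(eyebox_half, v))
    return (gain * clamp(head_dx), gain * clamp(head_dy))
```

Because this correction runs inside the render loop, it is also subject to the latency constraint mentioned above: a stale head-pose estimate produces the same misregistration it is meant to prevent.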
Additionally, while HUDs are designed for limited additional visual input for drivers, WSDs allow for a large design space. Field of view restrictions must therefore be overcome for further advances in large AR content spaces [75], [244]. This is not only valid for stationary AR projections on the HUD/WSD, but also for mobile HMDs. In the latter case, the increased weight needs to be considered in combination with driver fatigue [69].

3.2 Augmented Reality Applications for Pedestrians and Other Road Users

While not the core focus of our workshop, we also discussed the potential of AR technology outside of vehicles being used to interact with them, for example, the potential of smartphone AR as well as head-worn AR glasses to warn pedestrians about approaching cars. Our main findings during the brainstorming sessions on the usage of AR outside of vehicles relate to the application areas of safety and navigation.

3.2.1 AR Worn by Vulnerable Road Users

The use of AR apps must be approached with caution not only for in-vehicle experiences [44], but also from a pedestrian’s point of view [8]. Distractions caused by smartphone AR games such as Pokémon GO, in which players are incentivized to increase their physical activity, lead to safety issues that need to be addressed. For example, the AR game could display warning messages or visualize the source of danger (e. g., an approaching vehicle) to the smartphone user to keep them in the situational awareness loop [164]. Head-worn AR is also of increasing interest to academia and industry, for example in the form of bicycle or motorcycle helmet-integrated AR. First concepts (e. g., [218], [219]) show that new opportunities for AR designers and developers are opening up in the VRU AR design space. While initial concepts focus on safety and navigation features, similar to the automotive domain initially, we encourage researchers to explore new application areas.

3.2.2 AR on the Vehicle Used by Other Road Users

External car displays provide a novel design space for functionalities that vehicles can offer to pedestrians and other road users. For example, environment exploration and navigation instructions can be facilitated with external AR displays [33]. Additionally, pedestrians’ crossing behaviors can be influenced using such displays [49], [66]. Roadside maintenance can further be simplified by providing instructions on external displays, which can play an important part in building a cooperative vehicle infrastructure system [40].

3.2.3 AR/VR Used by Remote Operators

Another use case for AR/VR is immersive visualization for remotely controlling, or so-called teleoperating, vehicles. In this scenario, the remote operator could utilize VR to steer the vehicle, while the in-vehicle passengers see AR feedback on the HUD/WSD, thereby establishing and fostering collaboration with the remote operator [55], [143]. Moreover, smartphone AR can be employed to transmit a live camera feed to the remote operator, and both the smartphone user and the remote operator can mark areas in the live feed that need to be investigated by the on-site operator. This form of AR collaboration is already prototypically applied for remote assistance in industrial repairs, and shows great potential (e. g., [147]).

4 Discussion and Outlook

Automotive HCI researchers need to further investigate augmented reality technology in the context of automated driving. There are still open challenges to be solved before a broad adoption of AR technology in vehicles can be realized. For example, the use of AR HMDs vs. large HUDs/WSDs should be explored. Further, topics like context and situation awareness, information relevancy, as well as AR content perception and customization are research topics of interest. In addition, further interaction modalities transitioning away from haptic button/tablet interfaces should be researched (e. g., finger/hand gestures, speech commands, gaze-based interactions). Additionally, communication strategies for the exterior of vehicles, i. e., with pedestrians and other road users, need to be defined. The technological advancements also extend the use cases of augmented reality applications, with implications for new human factors research fields, e. g., passenger experiences with side-window AR content, or entertainment and work experiences being enhanced with 3D AR interfaces.

Our HCI workshop and literature review have several implications for future augmented reality applications for future mobility. Based on the presented opportunities, challenges, and practical usages, we recommend the following overarching research topics for HCI researchers and AR designers in the automotive domain:

  1. AR HUD vs. HMD. It is important to investigate different types of AR devices (HMD, WSD, smartphone/tablet-based AR) depending on different users (driver, passenger, pedestrian, cyclist) and use cases (e. g., driving, working, entertainment). In any case, the introduction of AR creates the opportunity to shift from traditional haptic interfaces to application-specific augmented/virtual interfaces.

  2. Real Augmented Reality. World-fixed, or contact-analog, AR as opposed to simply overlaying digital content on a HUD opens up the prospect to transition from 2D screens to novel 3D continuous-depth contents.

  3. Onboarding Processes need to be devised, as drivers may not inherently understand AR interfaces or how to interact with automated vehicles. Therefore, introducing new users to automated vehicles using AR should be explored, and could help build initial trust towards AVs.

  4. Designing for different Users. New technologies are often designed for young, technologically proficient users; however, with vehicles being used by people of all age groups, inclusive interfaces should be designed. AR displays, and corresponding interactions, must be developed with the involvement of representative users. As such, multimodal interfaces could help in bringing inclusiveness to the vehicle.

  5. Applications beyond Safety and Navigation. With level 3 AVs being introduced onto real roads, NDRTs can be designed in a way such that safety aspects are included. Therefore, safety does not become a standalone feature, but is integrated into non-driving activities. Ideally, these safety features would become invisible to the driver and eventually ubiquitous (similar to health-related safety features in wearable smart devices).

  6. Motion Sickness. With AR content being consumed by the driver and passengers, motion sickness in AVs should be kept to a minimum. As such, feedback strategies should be devised that allow vehicle occupants to mentally prepare for the road environment, such as conveying to them where the car is going.

  7. AR Applications outside the Vehicle. External vehicle interfaces open up the prospect for novel interactions, both safety and non-safety related, experienced by other road users, such as pedestrians and cyclists.

5 Conclusion

Automotive augmented reality applications will fundamentally change driver and passenger experiences, as well as external experiences for other road users and pedestrians. Usable AR display technology for various industrial and home entertainment applications is already available, and prototypes of AR head-up displays showcased by automotive manufacturers and suppliers in combination with semi-automated vehicles may foster further research efforts towards commercially available systems in the coming years. Novel AR applications will have to account for a wide range of perceptual and cognitive issues within the (semi-automated) vehicle design space. We presented and discussed application areas, challenges, opportunities, and practical use cases based on a literature review as well as an HCI workshop with participants from industry and academia, and concluded with recommendations for potential future research frontiers. More interdisciplinary research at the intersection of AR technology, HCI, UI/UX, and psychology is required to enable safe and practical AR applications for future mobility solutions.

Funding statement: This work was supported by the University of Applied Sciences PhD program and research subsidies granted by the government of Upper Austria.

About the authors

Andreas Riegler

Andreas Riegler is a researcher and principal investigator at the research group Mobile Interactive Systems (MINT) of the University of Applied Sciences Upper Austria. His research interests include Automotive Computing, Mobile Computing and Human-Computer Interaction. He is currently researching the use of Augmented and Virtual Reality in the context of automated driving.

Andreas Riener

Andreas Riener is a professor for Human-Machine Interaction and Virtual Reality at Technische Hochschule Ingolstadt (THI), Germany with co-appointment at CARISSMA (Center of Automotive Research on Integrated Safety Systems and Measurement Area). He is further leading the Human-Computer Interaction Group (HCIG) at THI. His research interests include driving ergonomics, driver state estimation from physiological measures, human factors in driver-vehicle interfaces as well as topics related to (over)trust, acceptance, and ethical issues in automated driving.

Clemens Holzmann

Clemens Holzmann is a full professor at the School of Informatics, Communications and Media of the University of Applied Sciences Upper Austria. Since 2018, he is also vice president for IT of the University of Applied Sciences Upper Austria. His research interests include Mobile Computing and Human-Computer Interaction. He is further leading the research group Mobile Interactive Systems (MINT) at University of Applied Sciences Upper Austria.

References

[1] 2021. MuC’21: Mensch und Computer 2021. Association for Computing Machinery, New York, NY, USA.

[2] Lotfi Abdi, Faten Ben Abdallah, and Aref Meddeb. 2015. In-Vehicle Augmented Reality Traffic Information System: A New Type of Communication Between Driver and Vehicle. Procedia Computer Science 73 (2015), 242–249. DOI: 10.1016/j.procs.2015.12.024.

[3] Lotfi Abdi and Aref Meddeb. 2017. In-vehicle augmented reality system to provide driving safety information. Journal of Visualization 21, 1 (2017), 163–184. DOI: 10.1007/s12650-017-0442-6.

[4] Patricia RJA Alves, Joel Goncalves, Rosaldo JF Rossetti, Eugenio C Oliveira, and Cristina Olaverri-Monreal. 2013. Forward collision warning systems using heads-up displays: Testing usability of two new metaphors. In 2013 IEEE Intelligent Vehicles Symposium Workshops (IV Workshops). IEEE, 1–6. DOI: 10.1109/IVWorkshops.2013.6615217.

[5] Emma van Amersfoorth, Lotte Roefs, Quinta Bonekamp, Laurent Schuermans, and Bastian Pfleging. 2019. Increasing driver awareness through translucency on windshield displays. In Proceedings of the 11th International Conference on Automotive User Interfaces and Interactive Vehicular Applications: Adjunct Proceedings. 156–160. DOI: 10.1145/3349263.3351911.

[6] Joan Arnedo, Lennart E Nacke, Vero Vanden Abeele, ZO Toups, Matthew Lakier, Lennart E Nacke, Takeo Igarashi, and Daniel Vogel. 2019. Cross-Car, Multiplayer Games for Semi-Autonomous Driving. In Proceedings of the Annual Symposium on Computer-Human Interaction in Play. 467–480. DOI: 10.1145/3311350.3347166.

[7] Ashratuz Zavin Asha, Fahim Anzum, Patrick Finn, Ehud Sharlin, and Mario Costa Sousa. 2020. Designing External Automotive Displays: VR Prototypes and Analysis. In 12th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 74–82. DOI: 10.1145/3409120.3410658.

[8] John W Ayers, Eric C Leas, Mark Dredze, Jon-Patrick Allem, Jurek G Grabowski, and Linda Hill. 2016. Pokémon GO – a new distraction for drivers and pedestrians. JAMA Internal Medicine 176, 12 (2016), 1865–1866. DOI: 10.1001/jamainternmed.2016.6274.

[9] Ronald T Azuma. 1997. A survey of augmented reality. Presence: Teleoperators & Virtual Environments 6, 4 (1997), 355–385. DOI: 10.1162/pres.1997.6.4.355.

[10] Mohammad Bahram, Michael Aeberhard, and Dirk Wollherr. 2015. Please take over! An analysis and strategy for a driver take over request during autonomous driving. In 2015 IEEE Intelligent Vehicles Symposium (IV). IEEE, 913–919. DOI: 10.1109/IVS.2015.7225801.

[11] Marcel CA Baltzer, Christian Lassen, Daniel López, and Frank Flemisch. 2018. Behaviour adaptation using interaction patterns with augmented reality elements. In International Conference on Augmented Cognition. Springer, 9–23. DOI: 10.1007/978-3-319-91470-1_2.

[12] Karlin Bark, Cuong Tran, Kikuo Fujimura, and Victor Ng-Thow-Hing. 2014a. Personal Navi: Benefits of an augmented reality navigational aid using a see-thru 3D volumetric HUD. In Proceedings of the 6th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. ACM, 1–8.

[13] Karlin Bark, Cuong Tran, Kikuo Fujimura, and Victor Ng-Thow-Hing. 2014b. Personal Navi: Benefits of an augmented reality navigational aid using a see-thru 3D volumetric HUD. In Proceedings of the 6th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 1–8. DOI: 10.1145/2667317.2667329.

[14] Tobias M Benz, Bernhard Riedl, and Lewis L Chuang. 2019. Projection Displays Induce Less Simulator Sickness than Head-Mounted Displays in a Real Vehicle Driving Simulator. In Proceedings of the 11th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 379–387. DOI: 10.1145/3342197.3344515.

[15] Mark Billinghurst, Hirokazu Kato, and Seiko Myojin. 2009. Advanced interaction techniques for augmented reality applications. In International Conference on Virtual and Mixed Reality. Springer, 13–22. DOI: 10.1007/978-3-642-02771-0_2.

[16] Anna Bolder, Stefan M Grünvogel, and Emanuel Angelescu. 2018. Comparison of the usability of a car infotainment system in a mixed reality environment and in a real car. In Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology. 1–10. DOI: 10.1145/3281505.3281512.

[17] Adam Bolton, Gary Burnett, and David R Large. 2015a. An investigation of augmented reality presentations of landmark-based navigation using a head-up display. In Proceedings of the 7th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 56–63. DOI: 10.1145/2799250.2799253.

[18] Adam Bolton, Gary Burnett, and David R Large. 2015b. An investigation of augmented reality presentations of landmark-based navigation using a head-up display. In Proceedings of the 7th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. ACM, 56–63. DOI: 10.1145/2799250.2799253.

[19] Martijn Bout, Anna Pernestål Brenden, Maria Klingegård, Azra Habibovic, and Marc-Philipp Böckle. 2017. A Head-Mounted Display to Support Teleoperations of Shared Automated Vehicles. In Proceedings of the 9th International Conference on Automotive User Interfaces and Interactive Vehicular Applications Adjunct. 62–66. DOI: 10.1145/3131726.3131758.

[20] Doug A Bowman and Ryan P McMahan. 2007. Virtual reality: how much immersion is enough? Computer 40, 7 (2007), 36–43. DOI: 10.1109/MC.2007.257.

[21] Efe Bozkir, David Geisler, and Enkelejda Kasneci. 2019. Assessment of Driver Attention during a Safety Critical Situation in VR to Generate VR-based Training. In ACM Symposium on Applied Perception 2019. 1–5. DOI: 10.1145/3343036.3343138.

[22] KF Bram-Larbi, V Charissis, S Khan, R Lagoo, DK Harrison, and D Drikakis. 2020. Collision Avoidance Head-Up Display: Design Considerations for Emergency Services’ Vehicles. In 2020 IEEE International Conference on Consumer Electronics (ICCE). 1–7. DOI: 10.1109/ICCE46568.2020.9043068.

[23] Daniel Brand, Kevin Büchele, and Alexander Meschtscherjakov. 2016. Pointing at the HUD. In Adjunct Proceedings of the 8th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 167–172. DOI: 10.1145/3004323.3004343.

[24] Nora Broy, Albrecht Schmid, Florian Alt, Simone Höckh, Annette Frederiksen, Michael Gilowski, Julian Eichhorn, Felix Naser, Horst Jung, Julia Niemann, and Martin Schell. 2014. Exploring Design Parameters for a 3D Head-Up Display. In Proceedings of The International Symposium on Pervasive Displays. 38–43. DOI: 10.1145/2611009.2611011.

[25] Alessandro Calvi, Fabrizio D’Amico, Chiara Ferrante, and Luca Bianchini Ciampoli. 2020a. Effectiveness of augmented reality warnings on driving behaviour whilst approaching pedestrian crossings: A driving simulator study. Accident Analysis & Prevention 147 (2020), 105760. DOI: 10.1016/j.aap.2020.105760.

[26] Alessandro Calvi, Fabrizio D’Amico, Chiara Ferrante, and Luca Bianchini Ciampoli. 2020b. Evaluation of augmented reality cues to improve the safety of left-turn maneuvers in a connected environment: A driving simulator study. Accident Analysis & Prevention 148 (2020), 105793. DOI: 10.1016/j.aap.2020.105793.

[27] Chu Cao, Zhenjiang Li, Pengfei Zhou, and Mo Li. 2018. Amateur. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 2, 4 (2018), 1–24. DOI: 10.1145/3287033.

[28] Chia-Ming Chang, Koki Toda, Daisuke Sakamoto, and Takeo Igarashi. 2017. Eyes on a Car. In Proceedings of the 9th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 65–73. DOI: 10.1145/3122986.3122989.

[29] Vassilis Charissis. 2014. Enhancing human responses through augmented reality Head-Up Display in vehicular environment. In 2014 11th International Conference & Expo on Emerging Technologies for a Smarter World (CEWIT). IEEE, 1–6. DOI: 10.1109/CEWIT.2014.7021141.

[30] Vassilis Charissis, Stylianos Papanastasiou, Warren Chan, and Evtim Peytchev. 2013. Evolution of a full-windshield HUD designed for current VANET communication standards. In 2013 16th International IEEE Conference on Intelligent Transportation Systems (ITSC). IEEE, 1637–1643. DOI: 10.1109/ITSC.2013.6728464.

[31] Bo-Hao Chen, Shih-Chia Huang, and Wei-Ho Tsai. 2017. Eliminating Driving Distractions: Human-Computer Interaction with Built-In Applications. IEEE Vehicular Technology Magazine 12, 1 (2017), 20–29. DOI:http://dx.doi.org/10.1109/mvt.2016.2625331.10.1109/MVT.2016.2625331Search in Google Scholar

[32] Sherrilene Classen, Megan Bewernitz, and Orit Shechtman. 2011. Driving simulator sickness: an evidence-based review of the literature. American journal of occupational therapy 65, 2 (2011), 179–188.10.5014/ajot.2011.000802Search in Google Scholar PubMed

[33] Ashley Colley, Jonna Häkkilä, Meri-Tuulia Forsman, Bastian Pfleging, and Florian Alt. 2018. Car Exterior Surface Displays. In Proceedings of the 7th ACM International Symposium on Pervasive Displays. 1–8. DOI:http://dx.doi.org/10.1145/3205873.3205880.10.1145/3205873.3205880Search in Google Scholar

[34] Mark Colley, Christian Bräuner, Mirjam Lanzer, Marcel Walch, Martin Baumann, and Enrico Rukzio. 2020. Effect of Visualization of Pedestrian Intention Recognition on Trust and Cognitive Load. In 12th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 181–191. DOI:http://dx.doi.org/10.1145/3409120.3410648.10.1145/3409120.3410648Search in Google Scholar

[35] SAE On-Road Automated Vehicle Standards Committee. 2021. Taxonomy and definitions for terms related to on-road motor vehicle automated driving systems. (2021).Search in Google Scholar

[36] Henrik Detjen, Robert Niklas Degenhart, Stefan Schneegass, and Stefan Geisler. 2021. Supporting User Onboarding in Automated Vehicles through Multimodal Augmented Reality Tutorials. Multimodal Technologies and Interaction 5, 5 (2021), 22. DOI: 10.3390/mti5050022.

[37] Henrik Detjen, Stefan Geisler, and Stefan Schneegass. 2020. “Help, Accident Ahead!”: Using Mixed Reality Environments in Automated Vehicles to Support Occupants After Passive Accident Experiences. In 12th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 58–61. DOI: 10.1145/3409251.3411723.

[38] Mandalapu Sarada Devi and Preeti R Bajaj. 2008. Driver fatigue detection based on eye tracking. In 2008 First International Conference on Emerging Trends in Engineering and Technology. IEEE, 649–652.

[39] Arindam Dey, Mark Billinghurst, Robert W Lindeman, and J Swan. 2018. A systematic review of 10 years of augmented reality usability studies: 2005 to 2014. Frontiers in Robotics and AI 5 (2018), 37. DOI: 10.3389/frobt.2018.00037.

[40] Zai Ming Ding. 2014. A design of Cooperative Vehicle Infrastructure System Based on Internet of Vehicle Technologies. In Applied Mechanics and Materials, Vol. 552. Trans Tech Publications, 363–366. DOI: 10.4028/www.scientific.net/AMM.552.363.

[41] Shadi Djavadian, Bilal Farooq, Rafael Vasquez, and Grace Yip. 2020. Virtual immersive reality based analysis of behavioural responses in connected and autonomous vehicle environment. In Mapping the Travel Behavior Genome. Elsevier, 543–559. DOI: 10.1016/B978-0-12-817340-4.00027-9.

[42] A Doshi, Shinko Yuanhsien Cheng, and MM Trivedi. 2009. A Novel Active Heads-Up Display for Driver Assistance. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics) 39, 1 (2009), 85–93. DOI: 10.1109/TSMCB.2008.923527.

[43] Robin Eyraud, Elisabetta Zibetti, and Thierry Baccino. 2015. Allocation of visual attention while driving with simulated augmented reality. Transportation Research Part F: Traffic Psychology and Behaviour 32 (2015), 46–55. DOI: 10.1016/j.trf.2015.04.011.

[44] Mara Faccio and John J McConnell. 2020. Death by Pokémon GO: The economic and human cost of using apps while driving. Journal of Risk and Insurance 87, 3 (2020), 815–849. DOI: 10.1111/jori.12301.

[45] Nayara de Oliveira Faria, Dina Kandil, and Joseph L Gabbard. 2019. Augmented reality head-up displays effect on drivers’ spatial knowledge acquisition. Proceedings of the Human Factors and Ergonomics Society Annual Meeting 63, 1 (2019), 1486–1487. DOI: 10.1177/1071181319631287.

[46] Alexander Feierle, David Beller, and Klaus Bengler. 2019. Head-Up Displays in Urban Partially Automated Driving: Effects of Using Augmented Reality. In 2019 IEEE Intelligent Transportation Systems Conference (ITSC). IEEE, 1877–1882. DOI: 10.1109/ITSC.2019.8917472.

[47] Nadia Fereydooni, Orit Shaer, and Andrew L Kun. 2019. Switching between augmented reality and a manual-visual task. In Proceedings of the 11th International Conference on Automotive User Interfaces and Interactive Vehicular Applications: Adjunct Proceedings. 99–103. DOI: 10.1145/3349263.3351502.

[48] George Filip, Xiaolin Meng, Gary Burnett, and Catherine Harvey. 2016. Designing and calibrating trust through situational awareness of the vehicle (SAV) feedback. (2016). DOI: 10.1049/cp.2016.1171.

[49] Lex Fridman, Bruce Mehler, Lei Xia, Yangyang Yang, Laura Yvonne Facusse, and Bryan Reimer. 2017. To walk or not to walk: Crowdsourced assessment of external vehicle-to-pedestrian displays. arXiv preprint arXiv:1707.02698 (2017).

[50] Vincent Frémont, Minh-Tien Phan, and Indira Thouvenin. 2019. Adaptive Visual Assistance System for Enhancing the Driver Awareness of Pedestrians. International Journal of Human–Computer Interaction 36, 9 (2019), 1–14. DOI: 10.1080/10447318.2019.1698220.

[51] Peter Fröhlich, Raimund Schatz, Peter Leitner, Matthias Baldauf, and Stephan Mantler. 2010. Augmenting the driver’s view with realtime safety-related information. In Proceedings of the 1st Augmented Human International Conference. 1–10. DOI: 10.1145/1785455.1785466.

[52] Wai-Tat Fu, John Gasper, and Seong-Whan Kim. 2013. Effects of an in-car augmented reality system on improving safety of younger and older drivers. In 2013 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). IEEE, 59–66. DOI: 10.1109/ISMAR.2013.6671764.

[53] Kikuo Fujimura, Lijie Xu, Cuong Tran, Rishabh Bhandari, and Victor Ng-Thow-Hing. 2013. Driver queries using wheel-constrained finger pointing and 3-D head-up display visual feedback. In Proceedings of the 5th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 56–62. DOI: 10.1145/2516540.2516551.

[54] Joseph L Gabbard, Missie Smith, Coleman Merenda, Gary Burnett, and David R Large. 2020. A Perceptual Color-Matching Method for Examining Color Blending in Augmented Reality Head-Up Display Graphics. IEEE Transactions on Visualization and Computer Graphics (2020). DOI: 10.1109/TVCG.2020.3044715.

[55] Jean-Michael Georg, Johannes Feiler, Frank Diermeyer, and Markus Lienkamp. 2018. Teleoperated Driving, a Key Technology for Automated Driving? Comparison of Actual Test Drives with a Head Mounted Display and Conventional Monitors. In 2018 21st International Conference on Intelligent Transportation Systems (ITSC). IEEE, 3403–3408.

[56] Michael A Gerber, Ronald Schroeter, Li Xiaomeng, and Mohammed Elhenawy. 2020. Self-Interruptions of Non-Driving Related Tasks in Automated Vehicles: Mobile vs Head-Up Display. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. 1–9. DOI: 10.1145/3313831.3376751.

[57] David Goedicke, Jamy Li, Vanessa Evers, and Wendy Ju. 2018a. VR-OOM: Virtual reality on-road driving simulation. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. 1–11.

[58] David Goedicke, Jamy Li, Vanessa Evers, and Wendy Ju. 2018b. VR-OOM: Virtual reality on-road driving simulation. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. 1–11. DOI: 10.1145/3173574.3173739.

[59] Taehyun Ha, Sangyeon Kim, Donghak Seo, and Sangwon Lee. 2020. Effects of explanation types and perceived risk on trust in autonomous vehicles. Transportation Research Part F: Traffic Psychology and Behaviour 73 (2020), 271–280. DOI: 10.1016/j.trf.2020.06.021.

[60] Renate Haeuslschmid, Susanne Forster, Katharina Vierheilig, Daniel Buschek, and Andreas Butz. 2017. Recognition of Text and Shapes on a Large-Sized Head-Up Display. In Proceedings of the 2017 Conference on Designing Interactive Systems. 821–831. DOI: 10.1145/3064663.3064736.

[61] Evan Hanau and Voicu Popescu. 2017. MotionReader: visual acceleration cues for alleviating passenger e-reader motion sickness. In Proceedings of the 9th International Conference on Automotive User Interfaces and Interactive Vehicular Applications Adjunct. 72–76. DOI: 10.1145/3131726.3131741.

[62] Renate Häuslschmid, Max von Buelow, Bastian Pfleging, and Andreas Butz. 2017. Supporting trust in autonomous driving. In Proceedings of the 22nd International Conference on Intelligent User Interfaces. ACM, 319–329.

[63] Ann-Christin Hensch, Nadine Rauh, Cornelia Schmidt, Sebastian Hergeth, Frederik Naujoks, Josef F Krems, and Andreas Keinath. 2020. Effects of secondary tasks and display position on glance behavior during partially automated driving. Transportation Research Part F: Traffic Psychology and Behaviour 68 (2020), 23–32. DOI: 10.1016/j.trf.2019.11.014.

[64] Lawrence J Hettinger and Gary E Riccio. 1992. Visually Induced Motion Sickness in Virtual Environments. Presence: Teleoperators and Virtual Environments 1, 3 (1992), 306–310. DOI: 10.1162/pres.1992.1.3.306.

[65] Eve Hoggan, Miguel Nacenta, Per Ola Kristensson, John Williamson, Antti Oulasvirta, and Anu Lehtiö. 2013. Multi-touch pinch gestures: Performance and ergonomics. In Proceedings of the 2013 ACM International Conference on Interactive Tabletops and Surfaces. 219–222. DOI: 10.1145/2512349.2512817.

[66] Kai Holländer, Ashley Colley, Christian Mai, Jonna Häkkilä, Florian Alt, and Bastian Pfleging. 2019a. Investigating the Influence of External Car Displays on Pedestrians’ Crossing Behavior in Virtual Reality. In Proceedings of the 21st International Conference on Human-Computer Interaction with Mobile Devices and Services. 1–11. DOI: 10.1145/3338286.3340138.

[67] Kai Holländer, Philipp Wintersberger, and Andreas Butz. 2019b. Overtrust in External Cues of Automated Vehicles. In Proceedings of the 11th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 211–221. DOI: 10.1145/3342197.3344528.

[68] Jason Hong. 2013. Considering privacy issues in the context of Google Glass. (2013). DOI: 10.1145/2524713.2524717.

[69] Hong Hua and Bahram Javidi. 2015. Augmented reality: easy on the eyes. Optics and Photonics News 26, 2 (2015), 26–33. DOI: 10.1364/OPN.26.2.000026.

[70] Yoonsook Hwang and Kyong-Ho Kim. 2016. Effects of the Displaying Augmented-Reality Information on the Driving Behavior of the Drivers with Specific Psychological Characteristics. In 2016 6th International Conference on IT Convergence and Security (ICITCS). IEEE, 1–2. DOI: 10.1109/ICITCS.2016.7740328.

[71] Yoonsook Hwang, Kyong-Ho Kim, and Hyun-Kyun Choi. 2017. Effects of the providing an augmented reality information on the cognitive reaction behavior under the reduced visibility traffic environment. In 2017 International Conference on Information and Communication Technology Convergence (ICTC). IEEE, 1031–1033. DOI: 10.1109/ICTC.2017.8190845.

[72] Yoonsook Hwang, Byoung-Jun Park, and Kyong-Ho Kim. 2016. The Effects of Augmented-Reality Head-Up Display System Usage on Drivers’ Risk Perception and Psychological Change. ETRI Journal 38, 4 (2016), 757–766. DOI: 10.4218/etrij.16.0115.0770.

[73] Renate Häuslschmid, Max von Bülow, Bastian Pfleging, and Andreas Butz. 2017. Supporting Trust in Autonomous Driving. In Proceedings of the 22nd International Conference on Intelligent User Interfaces. 319–329. DOI: 10.1145/3025171.3025198.

[74] Renate Häuslschmid, Sven Osterwald, Marcus Lang, and Andreas Butz. 2015. Augmenting the Driver’s View with Peripheral Information on a Windshield Display. In Proceedings of the 20th International Conference on Intelligent User Interfaces. 311–321. DOI: 10.1145/2678025.2701393.

[75] Renate Häuslschmid, Bastian Pfleging, and Florian Alt. 2016. A Design Space to Support the Development of Windshield Applications for the Car. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. 5076–5091. DOI: 10.1145/2858036.2858336.

[76] Renate Häuslschmid, Donghao Ren, Florian Alt, Andreas Butz, and Tobias Höllerer. 2019. Personalizing Content Presentation on Large 3D Head-Up Displays. PRESENCE: Virtual and Augmented Reality 27, 1 (2019), 80–106. DOI: 10.1162/pres_a_00315.

[77] Renate Häuslschmid, Laura Schnurr, Julie Wagner, and Andreas Butz. 2015. Contact-analog warnings on windshield displays promote monitoring the road scene. In Proceedings of the 7th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 64–71. DOI: 10.1145/2799250.2799274.

[78] Renate Häuslschmid, Yixin Shou, John O’Donovan, Gary Burnett, and Andreas Butz. 2016. First Steps towards a View Management Concept for Large-sized Head-up Displays with Continuous Depth. In Proceedings of the 8th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 1–8. DOI: 10.1145/3003715.3005418.

[79] Shantanu Ingle and Madhuri Phute. 2016. Tesla autopilot: semi autonomous driving, an uptick for future autonomy. International Research Journal of Engineering and Technology 3, 9 (2016), 369–372.

[80] Grega Jakus, Christina Dicke, and Jaka Sodnik. 2014. A user study of auditory, head-up and multi-modal displays in vehicles. Applied Ergonomics 46, Pt A (2014), 184–192. DOI: 10.1016/j.apergo.2014.08.008.

[81] Richie Jose, Gun A Lee, and Mark Billinghurst. 2016. A comparative study of simulated augmented reality displays for vehicle navigation. In Proceedings of the 28th Australian Conference on Computer-Human Interaction. 40–48. DOI: 10.1145/3010915.3010918.

[82] Valerie Kane, Missie Smith, Gary Burnett, Joseph L Gabbard, and David Large. 2016. Depth perception in mirrors: The effects of video-based augmented reality in driver’s side view mirrors. In 2016 IEEE Virtual Reality (VR). IEEE, 195–196. DOI: 10.1109/VR.2016.7504720.

[83] Nihan Karatas, Takahiro Tanaka, Kazuhiro Fujikake, Yuki Yoshihara, Hitoshi Kanamori, Yoshitaka Fuwamoto, and Morihiko Yoshida. 2020. Evaluation of AR-HUD Interface During an Automated Intervention in Manual Driving. In 2020 IEEE Intelligent Vehicles Symposium (IV). IEEE, 2158–2164. DOI: 10.1109/IV47402.2020.9304610.

[84] I Karl, G Berg, F Ruger, and B Farber. 2013. Driving Behavior and Simulator Sickness While Driving the Vehicle in the Loop: Validation of Longitudinal Driving Behavior. IEEE Intelligent Transportation Systems Magazine 5, 1 (2013), 42–57. DOI: 10.1109/MITS.2012.2217995.

[85] Takaya Kawamata, Itaru Kitahara, Yoshinari Kameda, and Yuichi Ohta. 2013. Poster: Lifted road map view on windshield display. In 2013 IEEE Symposium on 3D User Interfaces (3DUI). IEEE, 139–140. DOI: 10.1109/3DUI.2013.6550217.

[86] Juela Kazazi, Susann Winkler, and Mark Vollrath. 2015. Accident Prevention through Visual Warnings: How to Design Warnings in Head-up Display for Older and Younger Drivers. In 2015 IEEE 18th International Conference on Intelligent Transportation Systems. IEEE, 1028–1034. DOI: 10.1109/ITSC.2015.171.

[87] Robert S Kennedy and Jennifer E Fowlkes. 1992. Simulator sickness is polygenic and polysymptomatic: Implications for research. The International Journal of Aviation Psychology 2, 1 (1992), 23–38. DOI: 10.1207/s15327108ijap0201_2.

[88] Hyungil Kim, Alexandre Miranda Anon, Teruhisa Misu, Nanxiang Li, Ashish Tawari, and Kikuo Fujimura. 2016. Look at me: Augmented reality pedestrian warning system using an in-vehicle volumetric head up display. In Proceedings of the 21st International Conference on Intelligent User Interfaces. 294–298. DOI: 10.1145/2856767.2856815.

[89] Hyungil Kim, Joseph L Gabbard, Alexandre Miranda Anon, and Teruhisa Misu. 2018. Driver Behavior and Performance with Augmented Reality Pedestrian Collision Warning: An Outdoor User Study. IEEE Transactions on Visualization and Computer Graphics 24, 4 (2018), 1515–1524. DOI: 10.1109/TVCG.2018.2793680.

[90] Hyungil Kim, Xuefang Wu, Joseph L Gabbard, and Nicholas F Polys. 2013. Exploring head-up augmented reality interfaces for crash warning systems. In Proceedings of the 5th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 224–227. DOI: 10.1145/2516540.2516566.

[91] Hyang Sook Kim, Sol Hee Yoon, Meen Jong Kim, and Yong Gu Ji. 2015. Deriving future user experiences in autonomous vehicle. In Adjunct Proceedings of the 7th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 112–117. DOI: 10.1145/2809730.2809734.

[92] SeungJun Kim and Anind K Dey. 2009. Simulated augmented reality windshield display as a cognitive mapping aid for elder driver navigation. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. 133. DOI: 10.1145/1518701.1518724.

[93] Lorenz Cuno Klopfenstein, Saverio Delpriori, Silvia Malatini, and Alessandro Bogliolo. 2017. The rise of bots: A survey of conversational interfaces, patterns, and paradigms. In Proceedings of the 2017 Conference on Designing Interactive Systems. 555–565. DOI: 10.1145/3064663.3064672.

[94] Moritz Körber, Eva Baseler, and Klaus Bengler. 2018. Introduction matters: Manipulating trust in automation and reliance in automated driving. Applied Ergonomics 66 (2018), 18–31. DOI: 10.1016/j.apergo.2017.07.006.

[95] Ann-Kathrin Kraft, Christian Maag, and Martin Baumann. 2020a. Comparing dynamic and static illustration of an HMI for cooperative driving. Accident Analysis & Prevention 144 (2020), 105682. DOI: 10.1016/j.aap.2020.105682.

[96] Ann-Kathrin Kraft, Christian Maag, Maria Isabel Cruz, Martin Baumann, and Alexandra Neukum. 2020b. The effect of visual HMIs of a system assisting manual drivers in manoeuvre coordination in system limit and system failure situations. Transportation Research Part F: Traffic Psychology and Behaviour 74 (2020), 81–94. DOI: 10.1016/j.trf.2020.08.002.

[97] Ouren X Kuiper, Jelte E Bos, and Cyriel Diels. 2018. Looking forward: In-vehicle auxiliary display positioning affects carsickness. Applied Ergonomics 68 (2018), 169–175. DOI: 10.1016/j.apergo.2017.11.002.

[98] Andrew L Kun. 2018. Human-Machine Interaction for Vehicles: Review and Outlook. Foundations and Trends® in Human–Computer Interaction 11, 4 (2018), 201–293. DOI: 10.1561/1100000069.

[99] Andrew L Kun, Susanne Boll, and Albrecht Schmidt. 2016. Shifting gears: User interfaces in the age of autonomous driving. IEEE Pervasive Computing 15, 1 (2016), 32–38. DOI: 10.1109/MPRV.2016.14.

[100] Andrew L Kun, Hidde van der Meulen, and Christian P Janssen. 2017. Calling while driving: An initial experiment with HoloLens. (2017).

[101] Andrew L Kun, Hidde van der Meulen, and Christian P Janssen. 2019. Calling while Driving Using Augmented Reality: Blessing or Curse? PRESENCE: Virtual and Augmented Reality 27, 1 (2019), 1–14. DOI: 10.1162/pres_a_00316.

[102] Alexander Kunze, Stephen J Summerskill, Russell Marshall, and Ashleigh J Filtness. 2018. Augmented Reality Displays for Communicating Uncertainty Information in Automated Driving. In Proceedings of the 10th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 164–175. DOI: 10.1145/3239060.3239074.

[103] Ramesh Lagoo, Vassilis Charissis, Warren Chan, Soheeb Khan, and David Harrison. 2018. Prototype gesture recognition interface for vehicular head-up display system. In 2018 IEEE International Conference on Consumer Electronics (ICCE). 1–6. DOI: 10.1109/ICCE.2018.8326146.

[104] Ramesh Lagoo, Vassilis Charissis, and David K Harrison. 2019. Mitigating Driver’s Distraction: Automotive Head-Up Display and Gesture Recognition System. IEEE Consumer Electronics Magazine 8, 5 (2019), 79–85. DOI: 10.1109/MCE.2019.2923896.

[105] S Langlois and B Soualmi. 2016. Augmented reality versus classical HUD to take over from automated driving: An aid to smooth reactions and to anticipate maneuvers. In 2016 IEEE 19th International Conference on Intelligent Transportation Systems (ITSC). IEEE, 1571–1578. DOI: 10.1109/ITSC.2016.7795767.

[106] Sabine Langlois, Thomas Nguyen That, and Pierre Mermillod. 2016. Virtual Head-up Displays for Augmented Reality in Cars. In Proceedings of the European Conference on Cognitive Ergonomics. 1–8. DOI: 10.1145/2970930.2970946.

[107] Tobias Langner, Daniel Seifert, Bennet Fischer, Daniel Goehring, Tinosch Ganjineh, and Raul Rojas. 2016. Traffic awareness driver assistance based on stereovision, eye-tracking, and head-up display. In 2016 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 3167–3173. DOI: 10.1109/ICRA.2016.7487485.

[108] Felix Lauber and Andreas Butz. 2013. View management for driver assistance in an HMD. In 2013 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). IEEE, 1–6. DOI: 10.1109/ISMAR.2013.6671828.

[109] Felix Lauber and Andreas Butz. 2014. In-your-face, yet unseen? Improving head-stabilized warnings to reduce reaction time. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. 3201–3204. DOI: 10.1145/2556288.2557063.

[110] Felix Lauber, Claudius Böttcher, and Andreas Butz. 2014a. You’ve got the look: Visualizing infotainment shortcuts in head-mounted displays. In Proceedings of the 6th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 1–8. DOI: 10.1145/2667317.2667408.

[111] Felix Lauber, Anna Follmann, and Andreas Butz. 2014b. What you see is what you touch: Visualizing touch screen interaction in the head-up display. In Proceedings of the 2014 Conference on Designing Interactive Systems. 171–180. DOI: 10.1145/2598510.2598521.

[112] Rui Li, Yingjie Victor Chen, Linghao Zhang, Zhangfan Shen, and Zhenyu Cheryl Qian. 2020a. Effects of perception of head-up display on the driving safety of experienced and inexperienced drivers. Displays 64 (2020), 101962. DOI: 10.1016/j.displa.2020.101962.

[113] Xiaomeng Li, Ronald Schroeter, Andry Rakotonirainy, Jonny Kuo, and Michael G Lenné. 2020b. Effects of different non-driving-related-task display modes on drivers’ eye-movement patterns during take-over in an automated vehicle. Transportation Research Part F: Traffic Psychology and Behaviour 70 (2020), 135–148. DOI: 10.1016/j.trf.2020.03.001.

[114] Patrick Lindemann, Tae-Young Lee, and Gerhard Rigoll. 2018a. An Explanatory Windshield Display Interface with Augmented Reality Elements for Urban Autonomous Driving. In 2018 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct). IEEE, 36–37. DOI: 10.1109/ISMAR-Adjunct.2018.00027.

[115] Patrick Lindemann, Tae-Young Lee, and Gerhard Rigoll. 2018b. Catch My Drift: Elevating Situation Awareness for Highly Automated Driving with an Explanatory Windshield Display User Interface. Multimodal Technologies and Interaction 2, 4 (2018), 71. DOI: 10.3390/mti2040071.

[116] Patrick Lindemann, Tae-Young Lee, and Gerhard Rigoll. 2018c. Supporting Driver Situation Awareness for Autonomous Urban Driving with an Augmented-Reality Windshield Display. In 2018 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct). IEEE, 358–363. DOI: 10.1109/ISMAR-Adjunct.2018.00104.

[117] Patrick Lindemann, Niklas Müller, and Gerhard Rigoll. 2019. Exploring the Use of Augmented Reality Interfaces for Driver Assistance in Short-Notice Takeovers. In 2019 IEEE Intelligent Vehicles Symposium (IV). IEEE, 804–809. DOI: 10.1109/IVS.2019.8814237.

[118] Patrick Lindemann and Gerhard Rigoll. 2017a. A diminished reality simulation for driver-car interaction with transparent cockpits. In 2017 IEEE Virtual Reality (VR). IEEE, 305–306. DOI:http://dx.doi.org/10.1109/vr.2017.7892298.

[119] Patrick Lindemann and Gerhard Rigoll. 2017b. Examining the Impact of See-Through Cockpits on Driving Performance in a Mixed Reality Prototype. In Proceedings of the 9th International Conference on Automotive User Interfaces and Interactive Vehicular Applications Adjunct. 83–87. DOI:http://dx.doi.org/10.1145/3131726.3131754.

[120] Qiang Liu, Birte Emmermann, Oscar Suen, Bryan Grant, Jake Hercules, Erik Glaser, and Brian Lathrop. 2017. Rightward attentional bias in windshield displays: Implication towards external human machine interfaces for self-driving cars. In 2017 IEEE Conference on Cognitive and Computational Aspects of Situation Management (CogSIMA). IEEE, 1–7. DOI:http://dx.doi.org/10.1109/cogsima.2017.7929590.

[121] Irene Lopatovska, Katrina Rink, Ian Knight, Kieran Raines, Kevin Cosenza, Harriet Williams, Perachya Sorsche, David Hirsch, Qi Li, and Adrianna Martinez. 2019. Talk to me: Exploring user interactions with the Amazon Alexa. Journal of Librarianship and Information Science 51, 4 (2019), 984–997. DOI:http://dx.doi.org/10.1177/0961000618759414.

[122] Lutz Lorenz, Philipp Kerschbaum, and Josef Schumann. 2014. Designing take over scenarios for automated driving. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting, Vol. 58. SAGE Publications Sage CA: Los Angeles, CA, 1681–1685. DOI:http://dx.doi.org/10.1177/1541931214581351.

[123] Guillaume Lucas, Andras Kemeny, Damien Paillot, and Florent Colombet. 2020. A simulation sickness study on a driving simulator equipped with a vibration platform. Transportation Research Part F: Traffic Psychology and Behaviour 68 (2020), 15–22. DOI:http://dx.doi.org/10.1016/j.trf.2019.11.011.

[124] Fillia Makedon, Kathrin Konkol, Elisabeth Brandenburg, and Rainer Stark. 2020. Modular virtual reality to enable efficient user studies for autonomous driving. In Proceedings of the 13th ACM International Conference on PErvasive Technologies Related to Assistive Environments. 1–2. DOI:http://dx.doi.org/10.1145/3389189.3397647.

[125] Philipp Maruhn, André Dietrich, Lorenz Prasch, and Sonja Schneider. 2020. Analyzing Pedestrian Behavior in Augmented Reality — Proof of Concept. In 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR). IEEE, 313–321. DOI:http://dx.doi.org/10.1109/VR46266.2020.00051.

[126] Mark McGill and Stephen Brewster. 2019. Virtual reality passenger experiences. In Proceedings of the 11th International Conference on Automotive User Interfaces and Interactive Vehicular Applications: Adjunct Proceedings. 434–441. DOI:http://dx.doi.org/10.1145/3349263.3351330.

[127] Mark McGill, Alexander Ng, and Stephen Brewster. 2017. I Am The Passenger. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. 5655–5668. DOI:http://dx.doi.org/10.1145/3025453.3026046.

[128] Mark McGill, Julie Williamson, Alexander Ng, Frank Pollick, and Stephen Brewster. 2019. Challenges in passenger use of mixed reality headsets in cars and other transportation. Virtual Reality 24 (2019), 1–21. DOI:http://dx.doi.org/10.1007/s10055-019-00420-x.

[129] Zeljko Medenica, Andrew L Kun, Tim Paek, and Oskar Palinko. 2011a. Augmented reality vs. street views: a driving simulator study comparing two emerging navigation aids. In Proceedings of the 13th International Conference on Human Computer Interaction with Mobile Devices and Services. 265. DOI:http://dx.doi.org/10.1145/2037373.2037414.

[130] Zeljko Medenica, Andrew L Kun, Tim Paek, and Oskar Palinko. 2011b. Augmented reality vs. street views: a driving simulator study comparing two emerging navigation aids. In Proceedings of the 13th International Conference on Human Computer Interaction with Mobile Devices and Services. 265–274. DOI:http://dx.doi.org/10.1145/2037373.2037414.

[131] Coleman Merenda, Hyungil Kim, Joseph L Gabbard, Samantha Leong, David R Large, and Gary Burnett. 2017. Did You See Me? In Proceedings of the 9th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 40–49. DOI:http://dx.doi.org/10.1145/3122986.3123013.

[132] Coleman Merenda, Hyungil Kim, Kyle Tanous, Joseph L Gabbard, Blake Feichtl, Teruhisa Misu, and Chihiro Suga. 2018. Augmented Reality Interface Design Approaches for Goal-directed and Stimulus-driven Driving Tasks. IEEE Transactions on Visualization and Computer Graphics 24, 11 (2018), 2875–2885. DOI:http://dx.doi.org/10.1109/tvcg.2018.2868531.

[133] Coleman Merenda, Missie Smith, Joseph Gabbard, Gary Burnett, and David Large. 2016. Effects of real-world backgrounds on user interface color naming and matching in automotive AR HUDs. In 2016 IEEE VR 2016 Workshop on Perceptual and Cognitive Issues in AR (PERCAR). IEEE, 1–6. DOI:http://dx.doi.org/10.1109/percar.2016.7562419.

[134] Coleman Merenda, Chihiro Suga, Joseph Gabbard, and Teruhisa Misu. 2019a. Effects of Vehicle Simulation Visual Fidelity on Assessing Driver Performance and Behavior. In 2019 IEEE Intelligent Vehicles Symposium (IV). IEEE, 1679–1686. DOI:http://dx.doi.org/10.1109/ivs.2019.8813863.

[135] Coleman Merenda, Chihiro Suga, Joseph L Gabbard, and Teruhisa Misu. 2019b. Effects of “Real-World” Visual Fidelity on AR Interface Assessment: A Case Study Using AR Head-up Display Graphics in Driving. In 2019 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). IEEE, 145–156. DOI:http://dx.doi.org/10.1109/ismar.2019.00-10.

[136] Paul Milgram, Haruo Takemura, Akira Utsumi, and Fumio Kishino. 1995. Augmented reality: A class of displays on the reality-virtuality continuum. In Telemanipulator and Telepresence Technologies, Vol. 2351. International Society for Optics and Photonics, 282–292. DOI:http://dx.doi.org/10.1117/12.197321.

[137] Walter Morales-Alvarez, Oscar Sipele, Régis Léberon, Hadj Hamma Tadjine, and Cristina Olaverri-Monreal. 2020. Automated Driving: A Literature Review of the Take over Request in Conditional Automation. Electronics 9, 12 (2020), 2087. DOI:http://dx.doi.org/10.3390/electronics9122087.

[138] David C Morley, Grayson Lawrence, and Scott Smith. 2016. Virtual Reality User Experience as a Deterrent for Smartphone Use While Driving. In Proceedings of the 9th ACM International Conference on PErvasive Technologies Related to Assistive Environments. 1–3. DOI:http://dx.doi.org/10.1145/2910674.2910696.

[139] Lia Morra, Fabrizio Lamberti, Filippo Gabriele Prattico, Salvatore La Rosa, and Paolo Montuschi. 2019. Building Trust in Autonomous Vehicles: Role of Virtual Reality Driving Simulators in HMI Design. IEEE Transactions on Vehicular Technology 68, 10 (2019), 9438–9450. DOI:http://dx.doi.org/10.1109/tvt.2019.2933601.

[140] Ghada Moussa, Essam Radwan, and Khaled Hussain. 2012. Augmented Reality Vehicle system: Left-turn maneuver study. Transportation Research Part C: Emerging Technologies 21, 1 (2012), 1–16. DOI:http://dx.doi.org/10.1016/j.trc.2011.08.005.

[141] Sawako Nakajima, Shuichi Ino, Kazuhiko Yamashita, Mitsuru Sato, and Akio Kimura. 2009. Proposal of reduction method of Mixed Reality sickness using auditory stimuli for advanced driver assistance systems. In 2009 IEEE International Conference on Industrial Technology. IEEE, 1–5. DOI:http://dx.doi.org/10.1109/icit.2009.4939696.

[142] Frederik Naujoks, Dennis Befelein, Katharina Wiedemann, and Alexandra Neukum. 2017. A review of non-driving-related tasks used in studies on automated driving. In International Conference on Applied Human Factors and Ergonomics. Springer, 525–537. DOI:http://dx.doi.org/10.1007/978-3-319-60441-1_52.

[143] Stefan Neumeier, Nicolas Gay, Clemens Dannheim, and Christian Facchi. 2018. On the way to autonomous vehicles teleoperated driving. In AmE 2018-Automotive meets Electronics; 9th GMM-Symposium. VDE, 1–6.

[144] Victor Ng-Thow-Hing, Karlin Bark, Lee Beckwith, Cuong Tran, Rishabh Bhandari, and Srinath Sridhar. 2013. User-centered perspectives for automotive augmented reality. In 2013 IEEE International Symposium on Mixed and Augmented Reality – Arts, Media, and Humanities (ISMAR-AMH). 13–22. DOI:http://dx.doi.org/10.1109/ismar-amh.2013.6671262.

[145] Trung Thanh Nguyen, Kai Holländer, Marius Hoggenmueller, Callum Parker, and Martin Tomitsch. 2019. Designing for Projection-based Communication between Autonomous Vehicles and Pedestrians. In Proceedings of the 11th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 284–294. DOI:http://dx.doi.org/10.1145/3342197.3344543.

[146] Donald A Norman. 2010. Natural user interfaces are not natural. interactions 17, 3 (2010), 6–10. DOI:http://dx.doi.org/10.1145/1744161.1744163.

[147] F Obermair, J Althaler, U Seiler, P Zeilinger, A Lechner, L Pfaffeneder, M Richter, and J Wolfartsberger. 2020. Maintenance with augmented reality remote support in comparison to paper-based instructions: Experiment and analysis. In 2020 IEEE 7th International Conference on Industrial Engineering and Applications (ICIEA). IEEE, 942–947. DOI:http://dx.doi.org/10.1109/ICIEA49774.2020.9102078.

[148] Haruhiko Okumura and Kazumitsu Shinohara. 2016. Human Attention and fatigue for AR Head-Up Displays. In 2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct). IEEE, 306–309. DOI:http://dx.doi.org/10.1109/ismar-adjunct.2016.0102.

[149] Luis Oliveira, Christopher Burns, Jacob Luton, Sumeet Iyer, and Stewart Birrell. 2020. The influence of system transparency on trust: Evaluating interfaces in a highly automated vehicle. Transportation Research Part F: Traffic Psychology and Behaviour 72 (2020), 280–296. DOI:http://dx.doi.org/10.1016/j.trf.2020.06.001.

[150] Oskar Palinko, Andrew L Kun, Zachary Cook, Adam Downey, Aaron Lecomte, Meredith Swanson, and Tina Tomaszewski. 2013. Towards augmented reality navigation using affordable technology. In Proceedings of the 5th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 238–241. DOI:http://dx.doi.org/10.1145/2516540.2516569.

[151] Vandith Pamuru, Warut Khern-am nuai, and Karthik Kannan. 2021. The Impact of an Augmented-Reality Game on Local Businesses: A Study of Pokémon Go on Restaurants. Information Systems Research (2021), 1–17. DOI:http://dx.doi.org/10.1287/isre.2021.1004.

[152] Pablo E Paredes, Stephanie Balters, Kyle Qian, Elizabeth L Murnane, Francisco Ordóñez, Wendy Ju, and James A Landay. 2018. Driving with the Fishes: Towards calming and mindful virtual reality experiences for the car. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 2, 4 (2018), 1–21. DOI:http://dx.doi.org/10.1145/3287062.

[153] Byoung-Jun Park, Changrak Yoon, Jeong-Woo Lee, and Kyong-Ho Kim. 2015. Augmented reality based on driving situation awareness in vehicle. In 2015 17th International Conference on Advanced Communication Technology (ICACT). 593–595. DOI:http://dx.doi.org/10.1109/icact.2015.7224865.

[154] Juneyoung Park, Mohamed Abdel-Aty, Yina Wu, and Ilaria Mattei. 2018. Enhancing In-Vehicle Driving Assistance Information Under Connected Vehicle Environment. IEEE Transactions on Intelligent Transportation Systems 20, 9 (2018), 1–10. DOI:http://dx.doi.org/10.1109/tits.2018.2878736.

[155] Daniel Perez, Mahmud Hasan, Yuzhong Shen, and Hong Yang. 2019. AR-PED: A framework of augmented reality enabled pedestrian-in-the-loop simulation. Simulation Modelling Practice and Theory 94 (2019), 237–249. DOI:http://dx.doi.org/10.1016/j.simpat.2019.03.005.

[156] Dan Roland Persson, Valentino Servizi, Tanja Lind Hansen, and Per Bækgaard. 2020. Using Augmented Reality to Mitigate Blind Spots in Trucks. In International Conference on Human-Computer Interaction. 379–392. DOI:http://dx.doi.org/10.1007/978-3-030-50523-3_27.

[157] Ingrid Pettersson, MariAnne Karlsson, and Florin Timotei Ghiurau. 2019. Virtually the Same Experience? Learning from User Experience Evaluation of In-Vehicle Systems in VR and in the Field. In Proceedings of the 2019 on Designing Interactive Systems Conference. 463–473. DOI:http://dx.doi.org/10.1145/3322276.3322288.

[158] Lisa Pfannmüller, Martina Kramer, Bernhard Senner, and Klaus Bengler. 2015. A Comparison of Display Concepts for a Navigation System in an Automotive Contact Analog Head-up Display. Procedia Manufacturing 3 (2015), 2722–2729. DOI:http://dx.doi.org/10.1016/j.promfg.2015.07.678.

[159] Ken Pfeuffer, Yasmeen Abdrabou, Augusto Esteves, Radiah Rivu, Yomna Abdelrahman, Stefanie Meitner, Amr Saadi, and Florian Alt. 2021. ARtention: A design space for gaze-adaptive user interfaces in augmented reality. Computers & Graphics 95 (2021), 1–12. DOI:http://dx.doi.org/10.1016/j.cag.2021.01.001.

[160] Minh Tien Phan, Indira Thouvenin, and Vincent Fremont. 2016. Enhancing the driver awareness of pedestrian using augmented reality cues. In 2016 IEEE 19th International Conference on Intelligent Transportation Systems (ITSC). IEEE, 1298–1304. DOI:http://dx.doi.org/10.1109/itsc.2016.7795724.

[161] Jurgen Pichen, Fei Yan, and Martin Baumann. 2020. Towards a Cooperative Driver-Vehicle Interface: Enhancing Drivers’ Perception of Cyclists through Augmented Reality. In 2020 IEEE Intelligent Vehicles Symposium (IV). IEEE, 1827–1832. DOI:http://dx.doi.org/10.1109/iv47402.2020.9304621.

[162] Michael Plattner and Gerald Ostermayer. 2021. Undersampled Differential Phase Shift On–Off Keying for Visible Light Vehicle-to-Vehicle Communication. Applied Sciences 11, 5 (2021), 2195. DOI:http://dx.doi.org/10.3390/app11052195.

[163] Ioannis Politis, Stephen Brewster, and Frank Pollick. 2015. Language-based multimodal displays for the handover of control in autonomous cars. In Proceedings of the 7th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 3–10. DOI:http://dx.doi.org/10.1145/2799250.2799262.

[164] Romain Pourchon, Pierre-Majorique Léger, Élise Labonté-LeMoyne, Sylvain Sénécal, François Bellavance, Marc Fredette, and François Courtemanche. 2017. Is augmented reality leading to more risky behaviors? An experiment with Pokémon Go. In International Conference on HCI in Business, Government, and Organizations. Springer, 354–361. DOI:http://dx.doi.org/10.1007/978-3-319-58481-2_27.

[165] Gowdham Prabhakar, Aparna Ramakrishnan, Modiksha Madan, LRD Murthy, Vinay Krishna Sharma, Sachin Deshmukh, and Pradipta Biswas. 2019. Interactive gaze and finger controlled HUD for cars. Journal on Multimodal User Interfaces 14, 1 (2019), 101–121. DOI:http://dx.doi.org/10.1007/s12193-019-00316-9.

[166] Shiguang Qiu, Xu Jing, Xiumin Fan, and Qichang He. 2011. Using AR Technology for Automotive Visibility and Accessibility Assessment. In 2011 International Conference on Virtual Reality and Visualization. IEEE, 217–224. DOI:http://dx.doi.org/10.1109/icvrv.2011.26.

[167] Stanislava Rangelova and Elisabeth Andre. 2019. A Survey on Simulation Sickness in Driving Applications with Virtual Reality Head-Mounted Displays. PRESENCE: Virtual and Augmented Reality 27, 1 (2019), 15–31. DOI:http://dx.doi.org/10.1162/pres_a_00318.

[168] Qing Rao, Tobias Tropper, Christian Grunler, Markus Hammori, and Samarjit Chakraborty. 2014. AR-IVI—Implementation of In-Vehicle Augmented Reality. In 2014 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). IEEE, 3–8. DOI:http://dx.doi.org/10.1109/ismar.2014.6948402.

[169] Andreas Riegler, Bilal Aksoy, Andreas Riener, and Clemens Holzmann. 2020a. Gaze-based Interaction with Windshield Displays for Automated Driving: Impact of Dwell Time and Feedback Design on Task Performance and Subjective Workload. In 12th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 151–160.

[170] Andreas Riegler, Bilal Aksoy, Andreas Riener, and Clemens Holzmann. 2020b. Gaze-based Interaction with Windshield Displays for Automated Driving: Impact of Dwell Time and Feedback Design on Task Performance and Subjective Workload. In 12th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 151–160. DOI:http://dx.doi.org/10.1145/3409120.3410654.

[171] Andreas Riegler, Andreas Riener, and Clemens Holzmann. 2019a. Adaptive Dark Mode: Investigating Text and Transparency of Windshield Display Content for Automated Driving. In Mensch und Computer 2019. 8. DOI:http://dx.doi.org/10.18420/muc2019-ws-612.

[172] Andreas Riegler, Andreas Riener, and Clemens Holzmann. 2019b. AutoWSD: Virtual Reality Automated Driving Simulator for Rapid HCI Prototyping. In Mensch und Computer 2019. ACM, New York, NY, USA, 5. DOI:http://dx.doi.org/10.1145/3340764.3345366.

[173] Andreas Riegler, Andreas Riener, and Clemens Holzmann. 2019c. Towards Dynamic Positioning of Text Content on a Windshield Display for Automated Driving. In 25th ACM Symposium on Virtual Reality Software and Technology. 1–2. DOI:http://dx.doi.org/10.1145/3359996.3364757.

[174] Andreas Riegler, Andreas Riener, and Clemens Holzmann. 2019d. Virtual reality driving simulator for user studies on automated driving. In Proceedings of the 11th International Conference on Automotive User Interfaces and Interactive Vehicular Applications: Adjunct Proceedings. 502–507. DOI:http://dx.doi.org/10.1145/3349263.3349595.

[175] Andreas Riegler, Andreas Riener, and Clemens Holzmann. 2021a. A Systematic Review of Virtual Reality Applications for Automated Driving: 2009–2020. Frontiers in Human Dynamics (2021), 48. DOI:http://dx.doi.org/10.3389/fhumd.2021.689856.

[176] Andreas Riegler, Andreas Riener, and Clemens Holzmann. 2021b. Workshop on Mixed Reality Applications for In-Vehicle Experiences in Automated Driving. In Mensch und Computer 2021 – Workshopband (2021).

[177] Andreas Riegler, Klemens Weigl, Andreas Riener, and Clemens Holzmann. 2020. StickyWSD: Investigating Content Positioning on a Windshield Display for Automated Driving. In 19th International Conference on Mobile and Ubiquitous Multimedia. 143–151. DOI:http://dx.doi.org/10.1145/3428361.3428405.

[178] Andreas Riegler, Philipp Wintersberger, Andreas Riener, and Clemens Holzmann. 2018a. Investigating User Preferences for Windshield Displays in Automated Vehicles. In Proceedings of the 7th ACM International Symposium on Pervasive Displays (PerDis’18). ACM, New York, NY, USA, Article 8, 7 pages. DOI:http://dx.doi.org/10.1145/3205873.3205885.

[179] Andreas Riegler, Philipp Wintersberger, Andreas Riener, and Clemens Holzmann. 2018b. Investigating User Preferences for Windshield Displays in Automated Vehicles. 1–7. DOI:http://dx.doi.org/10.1145/3205873.3205885.

[180] Andreas Riegler, Philipp Wintersberger, Andreas Riener, and Clemens Holzmann. 2019a. Augmented Reality Windshield Displays and Their Potential to Enhance User Experience in Automated Driving. i-com 18, 2 (2019), 127–149. DOI:http://dx.doi.org/10.1515/icom-2018-0033.

[181] Andreas Riegler, Philipp Wintersberger, Andreas Riener, and Clemens Holzmann. 2019b. Augmented Reality Windshield Displays and Their Potential to Enhance User Experience in Automated Driving. i-com 18, 2 (2019), 127–149. DOI:http://dx.doi.org/10.1515/icom-2018-0033.

[182] Hyocheol Ro, Jung-Hyun Byun, Yoon Jung Park, Nam Kyu Lee, and Tack-Don Han. 2019. AR pointer: Advanced ray-casting interface using laser pointer metaphor for object manipulation in 3D augmented reality environment. Applied Sciences 9, 15 (2019), 3078. DOI:http://dx.doi.org/10.3390/app9153078.

[183] Michelle L Rusch, Mark C Schall, Patrick Gavin, John D Lee, Jeffrey D Dawson, Shaun Vecera, and Matthew Rizzo. 2013. Directing driver attention with augmented reality cues. Transportation Research Part F: Traffic Psychology and Behaviour 16 (2013), 127–137. DOI:http://dx.doi.org/10.1016/j.trf.2012.08.007.

[184] Michelle L Rusch, Mark C Schall, John D Lee, Jeffrey D Dawson, and Matthew Rizzo. 2014. Augmented reality cues to assist older drivers with gap estimation for left-turns. Accident Analysis & Prevention 71 (2014), 210–221. DOI:http://dx.doi.org/10.1016/j.aap.2014.05.020.

[185] M Saffarian, R Happee, D Abbink, and M Mulder. 2010. Investigating the Functionality of Different Human Machine Interfaces for Cooperative Adaptive Cruise Control. IFAC Proceedings Volumes 43, 13 (2010), 25–30. DOI:http://dx.doi.org/10.3182/20100831-4-fr-2021.00006.

[186] Yuki Sakamura, Akitoshi Tomita, Hidehiko Shishido, Tazu Mizunami, Kazuya Inoue, Yoshinari Kameda, Etsuko T Harada, and Itaru Kitahara. 2018. A Virtual Boarding System of an Autonomous Vehicle for Investigating the Effect of an AR Display on Passenger Comfort. In 2018 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct). IEEE, 344–349. DOI:http://dx.doi.org/10.1109/ismar-adjunct.2018.00101.

[187] Taishi Sawabe, Masayuki Kanbara, and Norihiro Hagita. 2017. Diminished reality for acceleration stimulus: Motion sickness reduction with vection for autonomous driving. In 2017 IEEE Virtual Reality (VR). IEEE, 277–278. DOI:http://dx.doi.org/10.1109/vr.2017.7892284.

[188] Tamara von Sawitzky, Philipp Wintersberger, Andreas Riener, and Joseph L Gabbard. 2019. Increasing trust in fully automated driving. In Proceedings of the 8th ACM International Symposium on Pervasive Displays. 1–7. DOI:http://dx.doi.org/10.1145/3321335.3324947.

[189] Clemens Schartmüller, Andreas Riener, Philipp Wintersberger, and Anna-Katharina Frison. 2018. Workaholistic: on balancing typing- and handover-performance in automated driving. In Proceedings of the 20th International Conference on Human-Computer Interaction with Mobile Devices and Services. 1–12.

[190] Clemens Schartmüller, Andreas Riener, Philipp Wintersberger, and Anna-Katharina Frison. 2018. Workaholistic: on balancing typing- and handover-performance in automated driving. In Proceedings of the 20th International Conference on Human-Computer Interaction with Mobile Devices and Services. 1–12. DOI:http://dx.doi.org/10.1145/3229434.3229459.

[191] Matthias Schneider, Anna Bruder, Marc Necker, Tim Schluesener, Niels Henze, and Christian Wolff. 2019. A field study to collect expert knowledge for the development of AR HUD navigation concepts. In Proceedings of the 11th International Conference on Automotive User Interfaces and Interactive Vehicular Applications: Adjunct Proceedings. 358–362. DOI:http://dx.doi.org/10.1145/3349263.3351339.

[192] Ronald Schroeter and Michael A Gerber. 2018. A Low-Cost VR-Based Automated Driving Simulator for Rapid Automotive UI Prototyping. In Adjunct Proceedings of the 10th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 248–251. DOI:http://dx.doi.org/10.1145/3239092.3267418.

[193] Ronald Schroeter and Fabius Steinberger. 2016. Pokémon DRIVE: towards increased situational awareness in semi-automated driving. In Proceedings of the 28th Australian Conference on Computer-Human Interaction. 25–29. DOI:http://dx.doi.org/10.1145/3010915.3010973.

[194] Felix Schwarz and Wolfgang Fastenmeier. 2017. Augmented reality warnings in vehicles: Effects of modality and specificity on effectiveness. Accident Analysis & Prevention 101 (2017), 55–66. DOI:http://dx.doi.org/10.1016/j.aap.2017.01.019.

[195] Nadja Schömig, Katharina Wiedemann, Frederik Naujoks, Alexandra Neukum, Bettina Leuchtenberg, and Thomas Vöhringer-Kuhnt. 2018. An Augmented Reality Display for Conditionally Automated Driving. In Adjunct Proceedings of the 10th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 137–141. DOI:http://dx.doi.org/10.1145/3239092.3265956.

[196] S Tarek Shahriar and Andrew L Kun. 2018a. Camera-View Augmented Reality. In Proceedings of the 10th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 146–154. DOI:http://dx.doi.org/10.1145/3239060.3240447.

[197] S Tarek Shahriar and Andrew L Kun. 2018b. Camera-View Augmented Reality: Overlaying Navigation Instructions on a Real-Time View of the Road. In Proceedings of the 10th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. ACM, 146–154. DOI:http://dx.doi.org/10.1145/3239060.3240447.

[198] Hardeep Singh, Jagjit Singh Bhatia, and Jasbir Kaur. 2011. Eye tracking based driver fatigue monitoring and warning system. In India International Conference on Power Electronics 2010 (IICPE2010). IEEE, 1–6. DOI:http://dx.doi.org/10.1109/IICPE.2011.5728062.

[199] Missie Smith, Nadejda Doutcheva, Joseph L Gabbard, and Gary Burnett. 2015. Optical see-through HUDs effect on depth judgments of real world objects. In 2015 IEEE Virtual Reality (VR). IEEE, 285–286. DOI:http://dx.doi.org/10.1109/vr.2015.7223407.

[200] Missie Smith, Joseph L Gabbard, Gary Burnett, and Nadejda Doutcheva. 2017. The Effects of Augmented Reality Head-Up Displays on Drivers’ Eye Scan Patterns, Performance, and Perceptions. International Journal of Mobile Human Computer Interaction 9, 2 (2017), 1–17. DOI:http://dx.doi.org/10.4018/ijmhci.2017040101.

[201] Missie Smith, Joseph L Gabbard, and Christian Conley. 2016. Head-Up vs. Head-Down Displays: examining traditional methods of display assessment while driving. In Proceedings of the 8th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 185–192. DOI:http://dx.doi.org/10.1145/3003715.3005419.

[202] Missie Smith, Lisa Jordan, Kiran Bagalkotkar, Srikar Sai Manjuluri, Rishikesh Nittala, and Joseph Gabbard. 2020. Hit the Brakes!: Augmented Reality Head-up Display Impact on Driver Responses to Unexpected Events. In 12th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 46–49. DOI:http://dx.doi.org/10.1145/3409251.3411720.

[203] Missie Smith, Jillian Streeter, Gary Burnett, and Joseph L Gabbard. 2015. Visual search tasks: the effects of head-up displays on driving and task performance. In Proceedings of the 7th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 80–87. DOI:http://dx.doi.org/10.1145/2799250.2799291.

[204] Alessandro Soro, Andry Rakotonirainy, Ronald Schroeter, and Sabine Wollstädter. 2014. Using Augmented Video to Test In-Car User Experiences of Context Analog HUDs. In Adjunct Proceedings of the 6th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 1–6. DOI:http://dx.doi.org/10.1145/2667239.2667302.

[205] Daniele Sportillo, Alexis Paljic, and Luciano Ojeda. 2019. On-Road Evaluation of Autonomous Driving Training. In 2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI). IEEE, 182–190. DOI:http://dx.doi.org/10.1109/hri.2019.8673277.

[206] Jonathan Steuer. 1992. Defining virtual reality: Dimensions determining telepresence. Journal of Communication 42, 4 (1992), 73–93. DOI:http://dx.doi.org/10.1111/j.1460-2466.1992.tb00812.x.

[207] Sonja Stockert, Natalie Tara Richardson, and Markus Lienkamp. 2015. Driving in an Increasingly Automated World – Approaches to Improve the Driver-automation Interaction. Procedia Manufacturing 3 (2015), 2889–2896. DOI:http://dx.doi.org/10.1016/j.promfg.2015.07.797.

[208] José A Sánchez, Laura Pozueco, Xabiel G Pañeda, Alejandro G Tuero, David Melendi, and Roberto García. 2016. Incorporation of Head-Up Display Devices in Real-Vehicular Environments to Improve Efficiency in Driving. In Proceedings of the XVII International Conference on Human Computer Interaction. 1–2. DOI:http://dx.doi.org/10.1145/2998626.2998637.

[209] K Tangmanee and S Teeravarunyou. 2012. Effects of guided arrows on head-up display towards the vehicle windshield. In 2012 Southeast Asian Network of Ergonomics Societies Conference (SEANES). 1–6. DOI:http://dx.doi.org/10.1109/seanes.2012.6299572.

[210] Kathryn G Tippey, Elayaraj Sivaraj, and Thomas K Ferris. 2017. Driving While Interacting With Google Glass: Investigating the Combined Effect of Head-Up Display and Hands-Free Input on Driving Safety and Multitask Performance. Human Factors: The Journal of the Human Factors and Ergonomics Society 59, 4 (2017), 671–688. DOI:http://dx.doi.org/10.1177/0018720817691406.

[211] Yourui Tong and Bochen Jia. 2019. An Augmented-reality-based Warning Interface for Pedestrians: User Interface Design and Evaluation. Proceedings of the Human Factors and Ergonomics Society Annual Meeting 63, 1 (2019), 1834–1838. DOI:http://dx.doi.org/10.1177/1071181319631413.

[212] Bethan Hannah Topliss, Catherine Harvey, and Gary Burnett. 2020. How Long Can a Driver Look? Exploring Time Thresholds to Evaluate Head-up Display Imagery. In 12th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 9–18. DOI:http://dx.doi.org/10.1145/3409120.3410663.

[213] Bethan Hannah Topliss, Sanna M Pampel, Gary Burnett, Lee Skrypchuk, and Chrisminder Hare. 2018. Establishing the Role of a Virtual Lead Vehicle as a Novel Augmented Reality Navigational Aid. In Proceedings of the 10th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 137–145. DOI:http://dx.doi.org/10.1145/3239060.3239069.10.1145/3239060.3239069Search in Google Scholar

[214] Cuong Tran, Karlin Bark, and Victor Ng-Thow-Hing. 2013. A left-turn driving aid using projected oncoming vehicle paths with augmented reality. In Proceedings of the 5th international conference on automotive user interfaces and interactive vehicular applications. 300–307. DOI:http://dx.doi.org/10.1145/2516540.2516581.10.1145/2516540.2516581Search in Google Scholar

[215] Akira Utsumi, Nao Kawanishi, Isamu Nagasawa, Kenichi Satou, Kyouhei Uchikata, and Norihiro Hagita. 2018. Impact of Directive Visual Information on Driver’s Emergency Behavior. In 2018 21st International Conference on Intelligent Transportation Systems (ITSC). IEEE, 1344–1349. DOI:http://dx.doi.org/10.1109/itsc.2018.8569892.10.1109/ITSC.2018.8569892Search in Google Scholar

[216] Akira Utsumi, Tsukasa Mikuni, and Isamu Nagasawa. 2019. Effect of On-Road Virtual Visual References on Vehicle Control Stability of Wide/Narrow FOV Drivers. In Proceedings of the 11th International Conference on Automotive User Interfaces and Interactive Vehicular Applications: Adjunct Proceedings (AutomotiveUI’19). Association for Computing Machinery, New York, NY, USA, 297–301. DOI:http://dx.doi.org/10.1145/3349263.3351331.10.1145/3349263.3351331Search in Google Scholar

[217] Gabriela Villalobos-Zúñiga, Tuomo Kujala, and Antti Oulasvirta. 2016. T9+HUD. In Proceedings of the 8th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 177–184. DOI: 10.1145/3003715.3005453.

[218] Tamara von Sawitzky, Thomas Grauschopf, and Andreas Riener. 2020a. No Need to Slow Down! A Head-up Display Based Warning System for Cyclists for Safe Passage of Parked Vehicles. In 12th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 1–3. DOI: 10.1145/3409251.3411708.

[219] Tamara von Sawitzky, Philipp Wintersberger, Andreas Löcken, Anna-Katharina Frison, and Andreas Riener. 2020b. Augmentation Concepts with HUDs for Cyclists to Improve Road Safety in Shared Spaces. In Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems. 1–9. DOI: 10.1145/3334480.3383022.

[220] Jingyao Wang, Jing Chen, Yuanyuan Qiao, Junyan Zhou, and Yongtian Wang. 2019b. Online Gesture Recognition Algorithm Applied to HUD Based Smart Driving System. In 2019 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct). IEEE, 289–294. DOI: 10.1109/ISMAR-Adjunct.2019.00-26.

[221] Jiao Wang and Dirk Söffker. 2016. Improving driving efficiency for hybrid electric vehicle with suitable interface. In 2016 IEEE International Conference on Systems, Man, and Cybernetics (SMC). IEEE, 928–933. DOI: 10.1109/SMC.2016.7844360.

[222] Shu Wang, Vassilis Charissis, and David K Harrison. 2017. Augmented Reality Prototype HUD for Passenger Infotainment in a Vehicular Environment. Advances in Science, Technology and Engineering Systems Journal 2, 3 (2017), 634–641. DOI: 10.25046/aj020381.

[223] S Wang, V Charissis, R Lagoo, J Campbell, and DK Harrison. 2019a. Reducing Driver Distraction by Utilizing Augmented Reality Head-Up Display System for Rear Passengers. In 2019 IEEE International Conference on Consumer Electronics (ICCE). IEEE, 1–6. DOI: 10.1109/ICCE.2019.8661927.

[224] Ying Wang, Shengfan He, Zuerhumuer Mohedan, Yueyan Zhu, Lijun Jiang, and Zhelin Li. 2014. Design and evaluation of a steering wheel-mount speech interface for drivers’ mobile use in car. In 17th International IEEE Conference on Intelligent Transportation Systems (ITSC). IEEE, 673–678. DOI: 10.1109/ITSC.2014.6957767.

[225] Florian Weidner and Wolfgang Broll. 2019. Interact with your car: a user-elicited gesture set to inform future in-car user interfaces. In Proceedings of the 18th International Conference on Mobile and Ubiquitous Multimedia. 1–12. DOI: 10.1145/3365610.3365625.

[226] Garrett Weinberg, Bret Harsham, and Zeljko Medenica. 2011a. Evaluating the usability of a head-up display for selection from choice lists in cars. In Proceedings of the 3rd International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 39–46. DOI: 10.1145/2381416.2381423.

[227] Garrett Weinberg, Bret Harsham, and Zeljko Medenica. 2011b. Investigating HUDs for the Presentation of Choice Lists in Car Navigation Systems. (2011), 195–202. DOI: 10.17077/drivingassessment.1397.

[228] Gesa Wiegand, Christian Mai, Kai Holländer, and Heinrich Hussmann. 2019. InCarAR. (2019), 1–13. DOI: 10.1145/3342197.3344539.

[229] Christian A Wiesner, Mike Ruf, Demet Sirim, and Gudrun Klinker. 2017. 3D-FRC: Depiction of the future road course in the Head-Up-Display. In 2017 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). IEEE, 136–143. DOI: 10.1109/ISMAR.2017.30.

[230] Benjamin Williams, Alexandra E Garton, and Christopher J Headleand. 2020. Exploring Visuo-haptic Feedback Congruency in Virtual Reality. In 2020 International Conference on Cyberworlds (CW). IEEE, 102–109. DOI: 10.1109/CW49994.2020.00022.

[231] Susann Winkler, Juela Kazazi, and Mark Vollrath. 2015. Distractive or Supportive – How Warnings in the Head-up Display Affect Drivers’ Gaze and Driving Behavior. In 2015 IEEE 18th International Conference on Intelligent Transportation Systems. IEEE, 1035–1040. DOI: 10.1109/ITSC.2015.172.

[232] Susann Winkler, Juela Kazazi, and Mark Vollrath. 2018. How to warn drivers in various safety-critical situations – Different strategies, different reactions. Accident Analysis & Prevention 117 (2018), 410–426. DOI: 10.1016/j.aap.2018.01.040.

[233] Philipp Wintersberger, Anna-Katharina Frison, Andreas Riener, and Tamara von Sawitzky. 2019a. Fostering User Acceptance and Trust in Fully Automated Vehicles: Evaluating the Potential of Augmented Reality. PRESENCE: Virtual and Augmented Reality 27, 1 (2019), 46–62. DOI: 10.1162/pres_a_00320.

[234] Philipp Wintersberger, Anna-Katharina Frison, Andreas Riener, and Tamara von Sawitzky. 2019b. Fostering User Acceptance and Trust in Fully Automated Vehicles: Evaluating the Potential of Augmented Reality. PRESENCE: Virtual and Augmented Reality 27, 1 (2019), 46–62. DOI: 10.1162/pres_a_00320.

[235] Philipp Wintersberger, Hannah Nicklas, Thomas Martlbauer, Stephan Hammer, and Andreas Riener. 2020. Explainable Automation: Personalized and Adaptive UIs to Foster Trust and Understanding of Driving Automation Systems. In 12th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 252–261. DOI: 10.1145/3409120.3410659.

[236] Philipp Wintersberger, Tamara von Sawitzky, Anna-Katharina Frison, and Andreas Riener. 2017a. Traffic Augmentation as a Means to Increase Trust in Automated Driving Systems. In Proceedings of the 12th Biannual Conference on Italian SIGCHI Chapter. 1–7. DOI: 10.1145/3125571.3125600.

[237] Philipp Wintersberger, Tamara von Sawitzky, Anna-Katharina Frison, and Andreas Riener. 2017b. Traffic Augmentation as a Means to Increase Trust in Automated Driving Systems. In Proceedings of the 12th Biannual Conference on Italian SIGCHI Chapter. ACM, 17. DOI: 10.1145/3125571.3125600.

[238] Xingwei Wu, Coleman Merenda, Teruhisa Misu, Kyle Tanous, Chihiro Suga, and Joseph L Gabbard. 2020. Drivers’ Attitudes and Perceptions towards A Driving Automation System with Augmented Reality Human-Machine Interfaces. In 2020 IEEE Intelligent Vehicles Symposium (IV). IEEE, 1978–1983. DOI: 10.1109/IV47402.2020.9304717.

[239] Yina Wu, Mohamed Abdel-Aty, Juneyoung Park, and Jiazheng Zhu. 2018. Effects of crash warning systems on rear-end crash avoidance behavior under fog conditions. Transportation Research Part C: Emerging Technologies 95 (2018), 481–492. DOI: 10.1016/j.trc.2018.08.001.

[240] Felix Wulf, Maria Rimini-Döring, Marc Arnon, and Frank Gauterin. 2015. Recommendations Supporting Situation Awareness in Partially Automated Driver Assistance Systems. IEEE Transactions on Intelligent Transportation Systems 16, 4 (2015), 2290–2296. DOI: 10.1109/TITS.2014.2376572.

[241] Bo Yang, Rencheng Zheng, Keisuke Shimono, Tsutomu Kaizuka, and Kimihiko Nakano. 2017. Evaluation of the effects of in-vehicle traffic lights on driving performances for unsignalised intersections. IET Intelligent Transport Systems 11, 2 (2017), 76–83. DOI: 10.1049/iet-its.2016.0084.

[242] Chung Kee Yeh, Yei Po Fang, Kuang Yi Shih, and Maxwell Jiang. 2013. Ergonomic Analysis of the Automotive Head-Up Displayed Information. Advanced Engineering Forum 10 (2013), 327–330. DOI: 10.4028/www.scientific.net/AEF.10.327.

[243] Yuandong Yin, Rencheng Zheng, Kimihiko Nakano, Bo Yang, and Shigeyuki Yamabe. 2016. Analysis of influence on driver behaviour while using in-vehicle traffic lights with application of head-up display. IET Intelligent Transport Systems 10, 5 (2016), 347–353. DOI: 10.1049/iet-its.2015.0179.

[244] Ali Ozgur Yontem, Kun Li, Daping Chu, Valerian Meijering, and Lee Skrypchuk. 2020. Prospective immersive human-machine interface for future vehicles: Multiple Zones Turn the Full Windscreen Into a Head-Up Display. IEEE Vehicular Technology Magazine 16, 1 (2020), 83–92. DOI: 10.1109/MVT.2020.3013832.

[245] Markus Zimmermann, David Schopf, Niklas Lütteken, Zhengzhenni Liu, Konrad Storost, Martin Baumann, Riender Happee, and Klaus J Bengler. 2018. Carrot and stick: A game-theoretic approach to motivate cooperative driving through social interaction. Transportation Research Part C: Emerging Technologies 88 (2018), 159–175. DOI: 10.1016/j.trc.2018.01.017.

[246] Stefanie Zollmann, Gerhard Schall, Sebastian Junghanns, and Gerhard Reitmayr. 2012. Comprehensible and interactive visualizations of GIS data in augmented reality. In International Symposium on Visual Computing. Springer, 675–685. DOI: 10.1007/978-3-642-33179-4_64.

Published Online: 2021-11-27
Published in Print: 2021-12-20

© 2021 Riegler et al., published by De Gruyter

This work is licensed under the Creative Commons Attribution 4.0 International License.
