According to a survey performed by the World Health Organization in October 2013, approximately 285 million people suffer from a visual disability, of whom 39 million are legally blind. For these people, navigating and orienting in unknown environments is a difficult task. As technology advances and the number and quality of assistive electronic travel aids increase, there is a need for an effective training method that can improve the spatial cognitive representation of blind people and enable them to travel safely and independently in real-life environments. The development of navigation skills relies on the remaining sensory modalities, such as hearing and touch. As the tactile sense is limited to the perception of nearby objects, hearing (with its large perceptual bandwidth) is the more powerful sensory channel for recreating the spatial relationship between a blind individual and the surrounding objects. Since it is generally believed that, in blind people, the deprivation of sight is compensated for by a behavioral improvement of the remaining senses, audio-based computer games prove to be an efficient rehabilitative technique for training and testing the navigation skills of visually impaired subjects within an immersive and user-centered virtual reality approach.
Computer games play an important role in the development of mental structures: they enhance learning, improve creativity and problem-solving skills, stimulate motivation and resolve communication issues. Unfortunately, because video games rely primarily on graphical information, visually impaired people are unable to play most of the computer games available on the market today. To overcome this drawback, audio games rely on rich sound content and build the entire user experience on an auditory interface alone or on a combined auditory and graphical interface. Navigational audio-based games support the development of spatial cognitive skills and help blind people access and manipulate environmental information by stimulating contextual learning. Moreover, contextual learning can be successfully transferred to the real world, raising sensory awareness, promoting searching skills and improving localization ability in unknown environments.
Some research studies have demonstrated that early blind people possess localization abilities equal to or better than those of sighted individuals. Moreover, they respond to monaural stimuli better than sighted subjects. In addition, participants with residual vision localized sounds less accurately than either sighted or completely blind individuals, suggesting that compensatory strategies emerge according to the level of blindness. These findings indicate that early and completely blind people are able to process auditory spatial information more effectively than late blind or sighted individuals, owing to advanced cross-modal brain plasticity and enhanced sensory substitution.
Training modalities for the development of orientation and mobility skills
Orientation and mobility are the main issues that must be resolved in order to improve the quality of life of visually impaired people. Traditional training modalities, such as guide dogs and white canes, have proved their efficiency for path navigation and obstacle detection. The use of the white cane as a training and way-finding tool depends on the orientation skills of the blind individual and on their prior spatial knowledge. Unfortunately, the visually impaired are reluctant to use the white cane extensively because they fear injury and because environmental infrastructure and design do not allow accessible navigation in unfamiliar, psychologically demanding or crowded spaces. Guide dogs offer assistance, enhance independence and mobility and alert blind users when dangerous objects or situations emerge. Over the years, advances in assistive technology have led to the development of electronic travel aids (ETAs). The best-known devices are electronic and laser canes, ultrasonic systems based on echolocation and modern tools that rely on image acquisition, 3D scene reconstruction and auditory rendering through 3D binaural sounds.
Audio games, however, are a reliable training modality that improves orientation and mobility through an accessible, pleasant and entertaining approach suited to the requirements of the visually impaired. The work of Merabet and Sanchez has demonstrated that the ludic approach to learning to navigate in unfamiliar settings helps players build a reliable spatial cognitive map of the environment and transfer this knowledge to real-world situations more efficiently than other training techniques.
Audio games for blind people
Virtual reality and audio games
The use of virtual-reality-based environments helps blind people develop strong spatial representation skills as it enables them to gather, manipulate and transfer the acquired information from the virtual to the real world. In addition to this, immersive simulators have proved to be useful for the enhancement of contextual learning and for the development of navigation and orientation abilities. The purpose of auditory-based virtual environments is to guide visually impaired subjects in building solid spatial cognitive maps that can help them to navigate safely in day-to-day situations. Various complex audio cues are assigned to the constitutive elements of the environment in order to describe and emphasize their characteristics. For instance, verbal notifications are used to give assistive information regarding the current location of the player in the context of the game environment or of required tasks, while non-verbal spectral cues provide indications on the meta-level of knowledge concerning the direction and distance of target objects. In audio-based games, sound information is perceived continuously and oriented contextually within an engaging and dynamic interface, the purpose of which is not only to provide an enthralling user experience, but also to build spatial cognitive maps, develop alternate sensory capabilities and improve orientation, mobility and navigation skills.
As Lokki and Gröhn demonstrated in one of their experiments, audio cues have proved to be as accurate as visual information for navigating in unknown environments. In fact, the audio-based localization of the objects encountered during the first stages of the experiment was significantly superior to the localization based on visual cues. Nonetheless, the audio signals employed in virtual environments must be thoroughly devised to convey the desired perception for effective spatial navigation.
The sonification process involves the assignment of audio cues to the most important environmental elements, objects and events. The sound characteristics, referred to as “auditory dimensions”, convey relevant information about the setting. To describe a virtual environment completely, several auditory dimensions (e.g. amplitude and pitch) must be combined to represent multidimensional data, since these dimensions interact and influence one another. For instance, a variation in amplitude (loudness) is used to encode the distance to a target object, whereas a change in pitch emphasizes a variation in elevation. The elements to which audio cues are assigned describe the virtual setting (spatial size, environmental components, color, distance, elevation, direction, exits) or help the player pursue the challenge of the game by pointing out dangerous characters, events or situations (Table 1).
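As a minimal sketch of such a mapping (the base pitch and scaling constants below are illustrative assumptions, not parameters taken from any of the games discussed), distance can be encoded as amplitude attenuation and elevation as a pitch shift:

```python
# Illustrative sonification mapping: distance -> amplitude, elevation -> pitch.
# BASE_PITCH_HZ and SEMITONES_PER_METER are assumed values for this sketch.
BASE_PITCH_HZ = 440.0
SEMITONES_PER_METER = 2.0

def amplitude_for_distance(distance_m, reference_m=1.0):
    """Inverse-distance attenuation: a target twice as far away sounds half as loud."""
    return min(1.0, reference_m / max(distance_m, reference_m))

def pitch_for_elevation(elevation_m):
    """Shift the base pitch by a fixed number of semitones per meter of elevation."""
    return BASE_PITCH_HZ * 2.0 ** (SEMITONES_PER_METER * elevation_m / 12.0)
```

With these assumed values, a target 2 m away plays at half amplitude, and a target 6 m above ear level sounds one octave (12 semitones) above the base pitch.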
In one of the games surveyed, the number of simultaneous sound sources is limited and sound is occluded by the walls, so that the player cannot hear sounds coming from nearby rooms. The majority of audio games try to reduce the number of concurrent sounds and the amount of ornamental audio to avoid navigation difficulties. Tim’s Journey, on the contrary, uses a multitude of simultaneous sounds that differ in importance, character and intensity.
Desirable characteristics of audio games
Blind people should perceive an audio game as a toy – an easy and interactive method of learning and, at the same time, a source of entertainment. To be accessible and appealing to visually impaired people, audio games should fulfill the following specifications:
A high level of immersion and attractiveness to motivate the user to advance in the game;
Diversity in audio cues;
A small amount or a complete lack of “game over” situations that discourage the user from continuing to play the game;
A single, clear way of resolving the issues raised by the scenario;
An intuitive learning strategy that is integrated into the outline of the game so that visually impaired people understand the requirements without the use of human or text-to-speech instructions;
Interactivity, engagement and fun so that blind people benefit from an enjoyable and entertaining experience.
WCAG 2.0 (Web Content Accessibility Guidelines) laid the foundation for the development of blind-accessible audio games for the Web. The main principles on which the content and the interface design should be based are the following:
Perception – the information should be presented to the users in an accessible manner for them to perceive and understand;
Operability – the users should be able to interact and communicate with the interface;
Comprehension – the information and the operation of the interface should be easy to understand;
Robustness – the game should remain accessible even as technology advances.
Our research has shown that the number of browser-based, Web 2.0 audio games is limited. Some relevant examples are Grail to the Thief (based on computer-synthesized speech, voiceovers, sound effects and screen readers), Sryth (a browser-based fantasy text role-playing game), Legend of the Green Dragon and a Harry Potter role-playing game for the visually impaired. However, there are no audio games available on popular socialization platforms, such as Facebook.
The majority of blind-accessible games are played from the first-person perspective to increase realism, flow, immersion and the sense of presence [shooters, adventure games, Blindfold, role-playing games such as Sryth and the Materia Magica Fantasy Role Playing Game, Funny Bowling, etc.].
Of the blind-accessible game platforms with socialization functionalities, one notable example is Zone BBS, a community portal of around 7000 blind users that offers 15 online and multiplayer games, as well as socializing tools (forum, chat, blog). The standard membership is free and offers games, chat, message boards (forum), user profiles and private mail between members. Additional features are available for premium members (audio profiles, more games, the ability to create public or private channels on voice chat, etc.).
In addition, visually impaired people can find multiplayer games on PCS Games. This portal offers several attractive multiplayer games: RS Games Yahtzee, Uno and Monopoly, which use the RS Games client and allow players from all over the world to compete via the centralized RS server; Swamp (the first multiplayer online shooting game); Rail Racer; the Materia Magica Fantasy Role Playing Game; and Funny Bowling.
Advantages of audio games
Audio games develop contextual learning and situational knowledge, raise awareness and improve response times to the scenario’s challenges. Games also play a significant role in the formation of behavioral abilities and mental structures by enhancing interactivity and problem-solving skills, improving creativity and imagination and stimulating learning. In addition, navigational audio games promote a solid mental representation of the environment, a higher level of independence, spatial reasoning skills, robust exploratory abilities and strong searching and way-finding capabilities [11–13].
The results of one experiment demonstrate that audio game performance transfers significantly to real-life navigation situations. These findings are remarkable, especially because good results have been observed in people who were not familiar with the real-world environments or who had never undergone previous orientation and mobility training. As a result, audio navigational games serve not only as a means of training the spatial representation skills of visually impaired people, but also as an effective way of translating the acquired abilities into the fulfillment of real-world exploratory goals.
Flow in audio games
Flow represents the feeling of “happiness, creativity, subjective well-being and fun” that people experience when they are involved in an activity that brings them a high level of excitement. Nowadays, video games generate flow experiences by promoting positive feelings, entertainment, immersion, a maximization of performance and determination to accomplish a final goal. To be appropriate for blind people, audio games should strike a balance between the user’s abilities and the scenario’s challenges. Moreover, they should fulfill the following six requirements identified by Csikszentmihalyi, the psychologist who introduced the concept of flow in the mid-1970s:
A demanding activity that intensively requires the abilities of the player;
Well-defined, comprehensible goals;
Concentration on the current assignment of the game;
A sensation of losing track of time while being immersed in the game;
A feeling of being in control of the game’s outcome at every moment;
Immersion in the action of the game and, concurrently, awareness of the changes that take place at every moment.
The sonification approach in navigational audio games
Sound is a meaningful channel for communicating information and emotion, almost as expressive as graphics. Audio signals can be successfully employed in games because they are easy to develop, encourage player movement and flexibility and improve immersion and interactivity without requiring expensive equipment (just a pair of stereophonic headphones and an input device such as a keyboard or a controller). Audio games can be entirely auditory, with no graphical interface or display at all. Their purpose is to deliver a vivid and dynamic experience that deepens the player’s understanding of the scenario in a meaningful way. For sound designers, it is essential to balance the requirements of aesthetics and functionality and to provide a “high level of complexity and open-endedness” by employing the transfer of knowledge on a “meta-level” of perception.
The sonification strategies employed in audio games make use of several auditory elements for establishing a communication interface with the user:
Auditory icons, which are simple, easily recognizable sounds that “describe objects, functions and actions”. They merge the previous knowledge of the listener with “natural auditory associations with sound sources and causes”. Auditory icons can incorporate a large variety of information simultaneously by combining and processing the audio dimensions of amplitude, pitch and timbre and by assigning a meaning to the corresponding objects and situations in the virtual environment.
Earcons, which are short musical phrases that serve as non-verbal audio messages for the player. They have an explanatory and descriptive character, creating a symbolic mapping between sound and information. For example, earcons can raise awareness of the current location of the player or the state of the game, indicate a change in the course of the plot or add contextual meaning to the menu of the user interface.
Speech, or verbal recordings (usually based on text-to-speech processors) that describe an object or a situation. Using speech in audio games raises important perception problems due to the large amount of attentional resources and working memory needed to play the game and understand the spoken message at the same time.
Auditory memory refers to the brain’s capacity to learn, comprehend, differentiate and recognize sounds. An additional drawback is language dependence: blind users may have difficulty understanding a voiceover, especially if it is spoken in a language other than their mother tongue.
The 3D rendering in all of the games is based on stereo-panning, conveyed through stereophonic headphones. The games discussed do not employ 3D sound based on head-related transfer functions (HRTFs). Playback over headphones is superior to loudspeakers and undoubtedly preferred by blind players, as they perceive the sound more accurately without losing salient directional information.
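A typical way to implement such stereo-panning is a constant-power pan law. The sketch below assumes a source azimuth normalized to [-1, 1] (hard left to hard right); this convention is illustrative and not the scheme documented for any particular game:

```python
import math

def constant_power_pan(azimuth):
    """Map a normalized azimuth in [-1, 1] to (left, right) channel gains.

    Constant-power panning keeps left**2 + right**2 equal to 1, so the
    perceived loudness stays constant as a source moves across the
    stereo field.
    """
    angle = (azimuth + 1.0) * math.pi / 4.0  # [-1, 1] -> [0, pi/2]
    return math.cos(angle), math.sin(angle)
```

A source straight ahead (azimuth 0) receives equal gains of about 0.707 on both channels, while a source hard left or hard right plays on one channel only.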
According to Michel Chion, there are three modes of human listening that have been widely employed in the development of sonification maps for audio games:
Causal listening, where the player listens to the sound, trying to determine its source, origin and characteristics;
Semantic listening, a process that involves understanding the meaning of speech sound sequences;
Reduced listening, a type of listening in which the focus is concentrated on perceiving the qualities of sound (amplitude, pitch or timbre), rather than considering the emitting source location or directional information.
One audio approach identifies and uses the following types of sounds:
Avatar sounds, which describe the sounds produced by the actions of the avatar;
Character sounds produced by non-player characters;
Ornamental sounds that act as ambient audio and are designed to make the gameplay more attractive;
Instructions or speech cues that give information regarding the current position of the user, the state of the game and the tasks that need to be accomplished.
All three types of listening can be used simultaneously in audio games to fulfill the requirements of functionality and aesthetics. Audio games can contain authentic, “natural-sounding” signals or “cartoonised”, altered soundscapes derived from original sounds that augment a certain context or situation. Sound cartoonification caricatures certain aspects of a sound event and simplifies the audio model, retaining perceptual invariants while discarding irrelevant detail.
In audio games, sound sequences are looped and layered in a rhythmical arrangement over time in order to generate a realistic perception by promoting causal and reduced listening. Game developers can extend the significance of objects by incorporating additional layers of information into the sound cues, enhancing contextual perception at the meta-level of understanding. The sound cues assigned to the objects in the virtual environment are not heard separately, but within the context of the game scenario; this is why they should be easy to identify and discern. Two strategies can be used to convey the presence of objects in auditory-based games. In the first, all sounds are heard continuously, which can render some audio cues indistinguishable; the second, more natural approach emits brief sounds associated with some form of interaction with the corresponding virtual objects and situations. Although the second approach is less confusing, it lacks the sense of overall perception and spatiality needed to give the player a general perspective of the setting. One sound design procedure represents a novel approach that lies somewhere between these two strategies: it encourages the player to explore the environment gradually, while listening to repetitive soundtrack patterns in order to become familiar with the game’s objects and situations.
Benefits of navigational audio games for blind people
Improving cognitive spatial maps
As recent studies have demonstrated, the use of audio-based exploration games by visually impaired people encourages the development of spatial cognitive maps. In addition, experimental results show that the overall performance of blind subjects is not related to the level of orientation and mobility experience they already possess. Audio games can employ an allocentric or an egocentric frame of reference as a navigation strategy. The allocentric frame of reference emphasizes the characteristics of the surrounding objects and does not depend on the point of view of the player. The egocentric approach is more subjective and user-centered, helping blind people acquire specific knowledge from a first-person perspective. Moreover, the same experiments demonstrated that directional information conveyed through audio spectral cues at the meta-level of perception simplifies the processing of cognitive tasks and reduces the memory load required for navigation. Hence, a game exploration metaphor that relies on spatial sounds helps blind people develop a strong and reliable spatial cognitive map that supports independent and safe navigation. The dynamic approach to active navigation and discovery of the virtual environment strengthens the relationship between the player and the surrounding objects. As a result, blind subjects benefit not only from the improvement and consolidation of contextual learning, but also from an advanced transfer of capabilities into the real world: way-finding, searching skills, imagination, confidence and abstract thinking.
The development of compensatory strategies
It is currently widely accepted that blind or visually impaired people benefit from compensatory alterations of the remaining senses, not only in their orientation and navigation skills, but also in the performance of the corresponding regions of the brain. These compensatory changes affect both the areas that control the functional sensory channels, such as hearing and touch, and the visual cortex. This neuroplasticity is also encountered in the brain regions that control navigation performance, spatial representation and memory (for example, an increase of activity has been observed in the hippocampus and in the parietal regions), in the sensory-motor area and in the frontal region responsible for decision-making. A good example to support these findings is the behavioral adaptation of taxi drivers in London, who undergo a 2-year training procedure designed to teach them to navigate the city’s complicated layout of streets and avenues. Brain-scanning studies of these drivers revealed that the hippocampus (the area responsible for localization memory and spatial task resolution) is significantly larger than in control subjects. We can therefore conclude that performing spatial imagery, search and route-finding tasks is a reliable approach for improving navigation and orientation capabilities.
A brief review of the most significant navigational audio games
Auditory-based environment simulator (AbES)
AbES [3, 14, 18] is an audio game developed by a team of researchers led by Dr. Lotfi Merabet, a clinical researcher at the Massachusetts Eye and Ear Infirmary in Boston, and Dr. Jaime Sanchez, from the University of Chile. AbES simulates navigation inside a virtual building, starting from the premise that the acquired contextual spatial learning can be transferred to real-world exploration.
Virtual environment Virtual rendering of a physical building (a two-story building with 23 rooms, 2 stairwells and 3 exits) that contains structural elements such as walls, doors, rooms and corridors, as well as furniture such as desks and tables.
Game scenario The purpose of the game is to search for randomly hidden jewels and to bring them out of the building as quickly as possible while avoiding monsters that try to take and hide them in another place.
User interface The audio interface helps the user identify their position, orientation and heading, as well as the location of nearby objects. The graphical interface gives clues about the layout of the building, the current state of the game and the positions of the objects and the avatar.
User interaction Keyboard interaction: space bar (go forward), H key (turn left), K key (turn right), J key (action, for example, open a door) and F key (help key, information about the current position of the player).
Sonification approach 3D naturalistic and comprehensible spatialized sounds, auditory icons, earcons and text-to-speech (TTS) soundscapes. Verbal commands for orientation (north, south, east and west), location and heading are based on TTS. Distance is encoded by sound intensity and elevation by pitch variation.
Transfer of learning The transfer of learning has been demonstrated by an experiment that offered significant results, suggesting that players were able to transfer the acquired spatial knowledge into real-world situations.
Tactile interactive multimedia (TIM) project
The TIM project, initiated by SITREC (Stockholm International Toy Research Center), proposes an approach based on the three modes of listening: causal, semantic and reduced. It tries to balance the issues of functionality and aesthetics and to convey information at a meta-level of perception.
Tim’s Journey is an adventure game that brings together complex interactive soundscapes, along with a “non-linear narration and open-ended gameplay”.
Virtual environment The game’s environment is an island that contains different scenes: a harbor, a forest and a mill.
Game scenario In this game, the player moves around freely in the setting to discover a hidden mystery.
User interface Both audio and graphical interface: proper user navigation is assured by an ambience reductor (lowers the volume of all objects that the user cannot interact with), footsteps (give information about the travel surface), helpers (characters that give information about the scenario) and foghorns (indicate a direction on the compass).
User interaction Keyboard interaction.
Sonification approach 3D directional sounds: each scene presents a different musical theme. The musical pieces change continuously, as they are a result of the sounds corresponding to each action in the environment. The importance of objects is emphasized by the intensity and repetition of their associated soundscapes.
Transfer of learning Not available.
Audio Doom
Audio Doom is an auditory-based counterpart of the exploration game “Doom”. The aim of Audio Doom is to involve blind children in navigational tasks that improve their spatial cognitive abilities, orientation, mobility and problem-solving skills.
Virtual environment A 3D labyrinth of walls and corridors.
Game scenario Players explore the virtual environment while trying to avoid monsters and reach the exit to the next level.
User interface A graphical and audio interface that allows the players to sequentially navigate through the game settings.
User interaction Keyboard, mouse and joystick.
Sonification approach Audio spectral cues (for example, the sound of opening doors or footsteps).
Transfer of learning An experiment with seven blind children demonstrated that, by playing Audio Doom, the subjects improved their spatial cognitive skills and developed good tactile representation abilities. For instance, the participants were able to rebuild the route they had traveled in the game using Lego blocks.
Shades of Doom
Shades of Doom [19, 20] is a first-person shooter audio game inspired by the old game Doom.
Virtual environment A top-secret research base made of a large number of corridors, rooms and doorways.
Game scenario The purpose of the game is to guide the avatar through the virtual environment in order to stop a dangerous experiment.
User interface Both audio and video.
User interaction Keyboard.
Sonification approach Dynamic and realistic multi-layered 3D sound, with up to 32 sounds playing simultaneously, that can make use of a stereo or surround-sound system; Doppler effects for realistic movement sounds; and synthesized 3D effects for non-surround sound systems.
Transfer of learning Not available.
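The Doppler cue used for movement sounds in Shades of Doom can be sketched with the standard Doppler formula for a moving source and a stationary listener; the speed of sound used here is the textbook value and the sketch is illustrative, since the game's actual implementation is not documented:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at room temperature

def doppler_shift(freq_hz, radial_velocity):
    """Perceived frequency for a stationary listener and a moving source.

    radial_velocity > 0 means the source approaches the listener
    (pitch rises); radial_velocity < 0 means it recedes (pitch falls).
    """
    return freq_hz * SPEED_OF_SOUND / (SPEED_OF_SOUND - radial_velocity)
```

An approaching source is thus rendered slightly sharper and a receding one slightly flatter, which is what gives moving sounds their sense of motion.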
Terraformers
Terraformers [19, 20], released in 2003, is an audio game accessible to sighted players as well as to players with total blindness or low vision.
Virtual environment A 3D virtual environment in which the action takes place in extraterrestrial space, on the imaginary planet Tellus 2.
Game scenario The players’ mission is to defeat the revolting robots that colonize the planet, to recover the computer components that they have scattered and to free their creator, Prof. Lange.
User interface Both visual and audio interfaces and a high-contrast mode that presents parts of the 3D graphics in black and white in order to be accessible to low-vision players.
User interaction Keyboard.
Sonification approach The sonification approach comprises 3D sound assigned to all the game’s objects; voice feedback; a sound compass with voice commands that indicate direction (north, south, east or west); a sonar whose 3D sounds indicate the distance to target objects; a GPS-like positioning system that announces the current location of the avatar through speech commands; and ambient environmental audio cues.
Transfer of learning Not available.
Pyvox
The game strategy of Pyvox is based on the toy-game approach, which attempts to create an enjoyable experience for the player while maintaining a balance between the challenge of the game and the skills of the player (the flow theory). The creators of Pyvox believe that a game should become playable as soon as the user interacts with it, without any narrative instructions or prior formal training.
Virtual environment A 70-floor labyrinth tower whose floors correspond to the 70 levels of the game; each floor has an exit and a number of walls.
Game scenario The object of the game is to find the exit for each floor without hitting the walls. When the task is accomplished on one level, the player can ascend to the next level.
User interface Both audio and visual interfaces.
User interaction Mouse cursor movements for navigation.
Sonification approach No verbal instructions at all. Walls are represented by unpleasant sounds, while the exit on each floor is indicated by a stereo rendering whose pitch varies according to the current position of the player relative to the location of the exit.
Transfer of learning This game is based on the flow theory, which states that a game should be perceived as a toy and should not require any previous training, instructions or explanations. An experiment performed with two groups of blind users demonstrated that all players understood the rules of the game well and managed to pass the first three levels.
This game, developed by researchers from Zform [6, 21] in 2002, is based on the Quake 1 open-source engine. It creates a first-person-perspective audio user interface available to both sighted and unsighted subjects.
Virtual environment A virtual maze consisting of rooms, corridors and exits.
Game scenario The purpose of the game is to collect a set of objects.
User interface Graphical user interface (GUI) and audio user interface (AUI). The game menu is presented by the voice of the main character (Momo, a monkey); multiplayer interaction.
User interaction Keyboard.
Sonification approach The sonification approach includes: a narrative introduction in which the player is told which objects to collect; artificial stereo-panning designed to convey 3D sound; air-conditioning vent sounds in the hallway that indicate a free-path corridor; footstep sounds; directional, stereo-panned “Ugh” sounds heard when the player hits a wall; speech instructions for obtaining a heading direction (compass); sound occlusion by the walls of a room, which prevents the player from hearing noises coming from other rooms; pleasant ambient sounds specific to each room (kitchen, study, office); a different type of footstep sound in each room depending on the floor material (for example, quiet footsteps in a study, more echoing footsteps in a kitchen); a representative sound assigned to each target object; and the audible presence and activity of other players in the multi-user environment.
Transfer of learning Not available.
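The artificial stereo panning mentioned in the sonification approach above is commonly implemented with an equal-power pan law, which keeps perceived loudness constant as a source moves between channels. The following is a generic sketch of that technique, not code from the game:

```python
import math

def constant_power_pan(sample, pan):
    """Split a mono sample into left/right gains using an equal-power pan law.

    pan is in [-1, 1]: -1 is hard left, 0 is center, +1 is hard right.
    """
    theta = (pan + 1.0) * math.pi / 4.0   # map [-1, 1] onto [0, pi/2]
    left = sample * math.cos(theta)
    right = sample * math.sin(theta)
    return left, right
```

Because cos² + sin² = 1, the summed power of the two channels stays constant for any pan position, avoiding the center dip of a naive linear crossfade.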
Blind Side [22–25] is an audio game developed by Aaron Rasmussen and Michael Astolfi, the scenario of which was inspired by Aaron’s personal experience of being temporarily blind after a high school chemistry accident. Blind Side aims to convey a realistic and immersive audio experience that transmits the emotions of rediscovering the world after waking up completely blind.
Virtual environment Audio-only virtual environment representing the settings of a building and a city.
Game scenario An assistant professor named Case wakes up totally blind near his girlfriend, Dawn. The goal of the game is to help Case and Dawn navigate through their environment and rediscover the world in their new condition as blind people.
User interface Audio interface only. Available on Mac, PC and for iOS mobile devices.
User interaction Keyboard, gyroscopic control scheme.
Sonification approach The sonification approach includes: 3D audio; over 1000 realistic recorded sounds; a narration system; low-pass filtering and attenuation for sounds located behind the listener; dynamic reverb; wall dampening; and a trailing effect that simulates contact with surrounding objects (for example, the sound varies in pitch according to the angle and speed at which the player slides along the surface of an object).
Transfer of learning An experiment conducted at the Smith-Kettlewell Eye Research Institute demonstrated that blind players usually finish the game faster than sighted players.
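The low-pass filtering and attenuation applied to sources behind the listener can be illustrated with a one-pole low-pass filter whose cutoff and gain depend on the source's relative angle. The specific gain and cutoff values below are assumptions chosen for illustration, not parameters of the game's engine:

```python
import math

def behind_listener_params(rel_angle_deg, samplerate=44100):
    """Return (gain, filter coefficient) for a source at the given relative angle.

    Sources behind the listener (|angle| > 90 degrees) are attenuated and
    low-passed more heavily, mimicking the muffling of rear sources.
    Values are illustrative assumptions.
    """
    behind = abs(rel_angle_deg) > 90.0
    gain = 0.6 if behind else 1.0
    cutoff = 2000.0 if behind else 8000.0
    # One-pole low-pass coefficient for y[n] = y[n-1] + alpha * (x[n] - y[n-1]).
    alpha = 1.0 - math.exp(-2.0 * math.pi * cutoff / samplerate)
    return gain, alpha

def one_pole_lowpass(samples, alpha):
    """Apply the one-pole low-pass recurrence to a list of samples."""
    y, out = 0.0, []
    for x in samples:
        y += alpha * (x - y)
        out.append(y)
    return out
```

A smaller alpha smooths the signal more aggressively, which removes high-frequency detail and makes rear sources sound duller as well as quieter.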
The audio-based games presented provide different user-interaction approaches to navigation in a virtual environment for visually impaired people. The plots involve uncovering target sound-emitting sources, avoiding obstacles or finding an exit from a building by navigating through a series of rooms and corridors. Audio games require intensive attention to, and concentration on, the perceived auditory stimuli. The sonification approach in these games is based on 3D directional audio, realistic recorded sounds, auditory icons, earcons and narrative speech that gives advisory information on the current state of the game or introductory clues concerning the manipulation of the menu user interface. The 3D sound field that envelops the player enlarges the game world and amplifies the sensation of immersion. Furthermore, it enhances situational awareness by providing directional clues, enriches the amount of information conveyed through sound and complements 3D perception by extending it to the auditory level. Moreover, the dimensions of sound (amplitude, pitch) are altered to convey information regarding the current position of objects and the spatial relationship between the player and the targets (distance, elevation). It is also important to take into account the role of ambient sounds, which create a powerful immersive effect that draws the player into the atmosphere generated by the game’s scenario. Finally, experiments conducted on visually impaired people who played audio games revealed a significant improvement in the acquisition of cognitive behavioral skills and an enhancement of contextual learning and spatial environmental representation. These results transfer to real-world navigation, which fulfills the objectives of virtual auditory-based simulation.
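The alteration of sound dimensions to convey distance and elevation, as described above, can be sketched with two simple mappings: amplitude falling off with distance and pitch rising with elevation. The inverse-distance rolloff and the one-octave-per-90-degrees range are illustrative assumptions, since the surveyed games do not specify their exact laws:

```python
def distance_attenuation(distance, ref=1.0):
    """Inverse-distance amplitude law, a common game-audio convention.

    Sources at or inside the reference distance play at full amplitude.
    """
    return ref / max(distance, ref)

def elevation_to_pitch_shift(elevation_deg, semitones_per_90deg=12.0):
    """Map elevation to a pitch multiplier: higher sources sound higher-pitched.

    With the default setting, a source 90 degrees overhead plays one octave up.
    """
    semitones = (elevation_deg / 90.0) * semitones_per_90deg
    return 2.0 ** (semitones / 12.0)
```

For instance, doubling the distance from the reference halves the amplitude, and a source at ear level (0 degrees) plays at its original pitch.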
In this paper, we have presented a series of audio games and virtual environments designed to provide a flexible and adjustable learning approach that would help visually impaired people to improve their ability to perform exploration and orientation tasks and to develop spatial navigation and problem-solving skills.
In conclusion, audio games represent an effective means toward progression in spatial contextual learning for visually impaired people. Navigating in a controlled game environment is a secure, risk-free task that provides a reliable rehabilitation method by improving functional independence and by developing orientation and mobility skills that, as has been demonstrated [3, 9], can be transferred to real-world situations. In addition, the challenging nature of audio games promotes a high level of engagement, immersion and active user interaction. The field of auditory-based game development is an area of research that requires the future attention of scientists from multidisciplinary domains, as we believe that it can help improve quality of life for blind people and, at the same time, shed light on the exceptional adaptive capacity of the brain.
The work has been funded by the Sectoral Operational Programme Human Resources Development 2007–2013 of the Ministry of European Funds through the Financial Agreement POSDRU/159/1.5/S/132395. This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 643636 “Sound of Vision”.
Author contributions: The author has accepted responsibility for the entire content of this submitted manuscript and approved submission.
Financial support: None declared.
Employment or leadership: None declared.
Honorarium: None declared.
Competing interests: The funding organization(s) played no role in the study design; in the collection, analysis, and interpretation of data; in the writing of the report; or in the decision to submit the report for publication.
Lessard N, Pare M, Lepore F, Lassonde M. Early-blind human subjects localize sound sources better than sighted subjects. Nature 1998;395:278–80.
Merabet LB, Sanchez J. Audio-based navigation using virtual environments: combining technology and neuroscience. AER J 2009;2:128–37.
Wersenyi G. Virtual localization by blind persons. J Audio Eng Soc 2012;60:568–79.
Grohn M, Lokki T, Takala T. Comparison of auditory, visual and audio-visual navigation in a 3D space. In: Proceedings of the International Conference on Auditory Display (ICAD 2003), Boston, MA, 2003:200–3.
Playing by ear: creating blind-accessible games 2002. Available from: http://www.gamasutra.com/resource_guide/20020520/andersen_pfv.htm. Accessed: 20 June 2014.
Friberg J, Gardenfors D. Audio games: new perspectives on game audio. In: Proceedings of ACE 2004, Singapore, June 3–5, 2004.
Gaudy T, Natkin S, Archambault D. Pyvox 2: an audio game accessible to visually impaired people playable without visual nor verbal instructions. Transactions on Edutainment II 2009;LNCS 5660:176–86.
Zone BB. Available from: https://www.zonebbs.com. Accessed: 2 September 2014.
PCS Games. Available from: http://www.pcsgames.net/game-co.htm. Accessed: 2 September 2014.
Cheiran J, Nedel L, Pimenta M. Inclusive games: a multimodal experience for blind players. In: Games and Digital Entertainment (SBGAMES), 2011:164–72. DOI: 10.1109/SBGAMES.2011.24.
Wersenyi G. Auditory representations of a graphical user interface for a better human-computer interaction. Lect Notes Comput Sci 2010;5954:80–102.
Wersenyi G. Evaluation of user habits for creating auditory representations of different software applications for blind persons. In: Proceedings of the 14th International Conference on Auditory Display (ICAD 08), Paris, France, June 23–28, 2008.
Connors EC, Chrastil RE, Sanchez J, Merabet LB. Action video game play and transfer of navigation and spatial cognition skills in adolescents who are blind. Front Hum Neurosci 2014;8:133.
Chen J. Flow in games (and everything else). Commun ACM 2007;50:31–4.
White GR, Fitzpatrick G, McAllister G. Toward accessible 3D virtual environments for the blind and visually impaired. In: Proceedings of the 3rd International Conference on Digital Interactive Media in Entertainment and Arts, ACM, September 2008:134–41.
Dingler T, Lindsay J, Walker BN. Learnability of sound cues for environmental features: auditory icons, earcons, spearcons and speech. In: Proceedings of the International Conference on Auditory Display (ICAD 2008), Paris, France, 24–27 June 2008.
Setting tradition on its ear: audio-based environments and gaming enhance navigation skills for the blind 2012. Available from: http://www.fctd.info/newsletters/302. Accessed: 20 June 2014.
Westin T. Game accessibility case study: Terraformers – a real-time 3D graphics game. In: Proceedings of the Fifth International Conference on Disability, Virtual Reality and Associated Technologies, Oxford, UK, 2004.
Audio Games. Available from: http://audiogames.net. Accessed: 20 June 2014.
Sanchez J, Lumbreras M. Usability and cognitive impact of the interaction with 3D virtual interactive acoustic environments by blind children. In: Proceedings of the Third International Conference on Disability, Virtual Reality and Associated Technology, Alghero, Italy, 2000.
Blind side. Available from: http://michaeltastolfi.com/#/blindside. Accessed: 25 June 2014.
A sound solution: history of audio games for the visually impaired. Available from: http://artistryingames.com/sound-solution-history-audio-games-visually-impaired. Accessed: 27 June 2014.
A video game that you can’t even see. Available from: http://www.newyorker.com/online/blogs/elements/2013/12/where-are-the-games-for-disabled-players.html. Accessed: 27 June 2014.
Balan O, Moldoveanu F, Morar A, Asavei V. Experiments on training the sound localization abilities: a systematic review. In: eLSE Conference, Bucharest, 2014.
Moldoveanu A, Balan O, Moldoveanu F. Training system for improving spatial sound localization. In: eLSE Conference, Bucharest, 2014.
6 ways 3D audio can expand gaming experiences. Available from: http://www.gamasutra.com/blogs/MichelHenein/20131101/203770/6_Ways_3D_Audio_Can_Expand_Gaming_Experiences.php. Accessed: 29 June 2014.
About the article
Published Online: 2015-04-21
Published in Print: 2015-05-01