Subtitling Virtual Reality into Arabic: Eye Tracking 360-Degree Video for Exploring Viewing Experience

Amer Al-Adwan and El Mehdi Ibourk


Recent years have witnessed the emergence of new approaches to filmmaking, including virtual reality (VR), which aims to achieve an immersive viewing experience through advanced electronic devices such as VR headsets. The VR industry develops content mainly in English and Japanese, leaving vast audiences unable to understand the original content, or even to enjoy this novel technology, because of language barriers. Using eye tracking technology, this paper examines the impact of Arabic subtitles on the viewing experience and behaviour of eight Arab participants. It also offers insight into how viewers watch a VR 360-degree documentary and into the factors that lead them to favour one subtitling mode over another in the spherical environment. To this end, a case study was designed in which 120-degree subtitles and Follow Head Immediately subtitles were produced, and the subtitled documentary was then projected through an eye-tracking VR headset. The eye tracking data are analysed in combination with post-viewing interviews in order to better understand the viewing experience of the Arab audience, their cognitive reception, and the reasons for favouring one type of subtitles over the other.
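The two subtitling modes tested in the case study differ in how the subtitle is anchored within the 360-degree sphere: Follow Head Immediately subtitles track the viewer's head and always sit straight ahead, whereas 120-degree subtitles are placed at fixed positions spaced 120 degrees apart, so the viewer reads the copy nearest to their current head direction. A minimal sketch of this placement logic is given below; the function names and the exact anchor yaws are illustrative assumptions, not the authors' implementation.

```python
import math


def wrap_deg(angle):
    """Normalise an angle in degrees to the range [-180, 180)."""
    return (angle + 180.0) % 360.0 - 180.0


def subtitle_yaw(head_yaw, mode):
    """Return the yaw (in degrees) at which the subtitle is rendered.

    'head_locked' -- Follow Head Immediately: the subtitle tracks the
                     viewer's head, so it always appears straight ahead.
    'fixed_120'   -- 120-degree subtitles: copies are anchored at fixed
                     yaws 120 degrees apart (here 0, 120, -120, an assumed
                     layout); the copy nearest the head direction is read.
    """
    if mode == "head_locked":
        return wrap_deg(head_yaw)
    if mode == "fixed_120":
        anchors = [0.0, 120.0, -120.0]
        return min(anchors, key=lambda a: abs(wrap_deg(head_yaw - a)))
    raise ValueError(f"unknown mode: {mode}")
```

For example, a viewer whose head is turned 90 degrees to the right would read the fixed copy anchored at 120 degrees, while under the head-locked mode the subtitle would simply follow them to 90 degrees.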




Since its founding in 1956, Lebende Sprachen [Living Languages] has been the leading German journal for foreign languages in research and practice. It contains articles and reviews on language in general and also covers topics on specific languages and cultures, living languages and the life of language.