Multimodal Navigation Systems for Users with Visual Impairments—A Review and Analysis
Abstract
1. Introduction
2. Multimodality in Human–Computer Interaction
3. Multimodality in Navigation Systems
3.1. Multimodal Navigation Systems
3.2. Interfaces
3.3. Maps
3.4. Virtual Learning Environments
4. Discussion
5. Recommendations and Challenges in the Design of Multimodal Navigation Systems
5.1. Recommendations
- Multimodality—multiple modalities should be available, and among them, audio feedback is always expected.
- Customisability—flexible customisation options should be available so that users can apply their preferred settings.
- Extendibility—it should be possible to add a new feature or a new modality at a later stage.
- Portability—the whole system should be portable and should not burden the user with many devices.
- Simplicity—adding modalities should not make the system feel complex to users or cause confusion when choosing between them.
- Dynamic mode selection—users should be able to dynamically select the mode of interaction most appropriate to their current needs and environment (see the sketch after this list).
- Adaptability—using machine learning techniques, multimodal systems can be designed to adapt to varying environments.
- Privacy and security—the system should address both the privacy and the security of the user.
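To make these recommendations concrete, the following is a minimal, purely illustrative Python sketch of a feedback router that combines several of them: multimodality with audio always available, customisability via user preferences, extendibility through pluggable renderers, and dynamic mode selection driven by ambient noise. All names (`Channel`, `UserPreferences`, `FeedbackRouter`) and the 75 dB threshold are our own assumptions, not a design taken from any system in this review.

```python
# Illustrative sketch only; not code from any surveyed system.
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import Callable, Dict, Set


class Channel(Enum):
    SPEECH = auto()       # audio feedback is always expected
    VIBRATION = auto()
    BRAILLE = auto()


@dataclass
class UserPreferences:
    # Customisability: user-tunable settings with sensible defaults.
    enabled: Set[Channel] = field(
        default_factory=lambda: {Channel.SPEECH, Channel.VIBRATION})
    speech_rate: float = 1.0          # 1.0 = normal TTS speed
    vibration_strength: float = 0.7   # 0..1 motor amplitude


class FeedbackRouter:
    """Routes one navigation message to every active output channel."""

    def __init__(self, prefs: UserPreferences) -> None:
        self.prefs = prefs
        self._renderers: Dict[Channel, Callable[[str], None]] = {}

    # Extendibility: a new modality is one register() call, no rewrite.
    def register(self, channel: Channel,
                 renderer: Callable[[str], None]) -> None:
        self._renderers[channel] = renderer

    def emit(self, message: str, ambient_noise_db: float) -> None:
        channels = set(self.prefs.enabled)
        # Dynamic mode selection: in loud environments, fall back to
        # touch-based channels so the message is not masked by noise.
        # The 75 dB threshold is an arbitrary placeholder.
        if ambient_noise_db > 75 and Channel.VIBRATION in self._renderers:
            channels.discard(Channel.SPEECH)
            channels.add(Channel.VIBRATION)
        for channel in channels:
            renderer = self._renderers.get(channel)
            if renderer is not None:
                renderer(message)


if __name__ == "__main__":
    router = FeedbackRouter(UserPreferences())
    router.register(Channel.SPEECH, lambda m: print(f"[TTS] {m}"))
    router.register(Channel.VIBRATION, lambda m: print(f"[buzz] {m!r}"))
    router.emit("Turn left in 10 metres", ambient_noise_db=60)  # speech + buzz
    router.emit("Obstacle ahead", ambient_noise_db=85)          # buzz only
```

Keeping each modality behind a registered callback means a braille display or an earcon channel can be added later without touching the routing logic, which is the essence of the extendibility recommendation.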
5.2. Challenges
6. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
- Montello, D.R. Navigation. In The Cambridge Handbook of Visuospatial Thinking; Cambridge University Press: Cambridge, UK, 2005; pp. 257–294. [Google Scholar]
- Giudice, N.A.; Legge, G.E. Blind navigation and the role of technology. In The Engineering Handbook of Smart Technology for Aging, Disability, and Independence; Wiley: Hoboken, NJ, USA, 2008; Volume 8, pp. 479–500. [Google Scholar]
- Schinazi, V.R.; Thrash, T.; Chebat, D.R. Spatial navigation by congenitally blind individuals. WIREs Cogn. Sci. 2016, 7, 37–58. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Thinus-Blanc, C.; Gaunet, F. Representation of space in blind persons: Vision as a spatial sense? Psychol. Bull. 1997, 121, 20. [Google Scholar] [CrossRef] [PubMed]
- Long, R.G.; Hill, E. Establishing and maintaining orientation for mobility. In Foundations of Orientation and Mobility; American Foundation for the Blind: Arlington County, VA, USA, 1997; Volume 1. [Google Scholar]
- Giudice, N.A. Navigating without vision: Principles of blind spatial cognition. In Handbook of Behavioral and Cognitive Geography; Edward Elgar Publishing: Cheltenham, UK, 2018. [Google Scholar]
- Riazi, A.; Riazi, F.; Yoosfi, R.; Bahmeei, F. Outdoor difficulties experienced by a group of visually impaired Iranian people. J. Curr. Ophthalmol. 2016, 28, 85–90. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Manduchi, R.; Kurniawan, S. Mobility-related accidents experienced by people with visual impairment. AER J. Res. Pract. Vis. Impair. Blind. 2011, 4, 44–54. [Google Scholar]
- Dos Santos, A.D.P.; Medola, F.O.; Cinelli, M.J.; Ramirez, A.R.G.; Sandnes, F.E. Are electronic white canes better than traditional canes? A comparative study with blind and blindfolded participants. In Universal Access in the Information Society; Springer: Berlin/Heidelberg, Germany, 2020; pp. 1–11. [Google Scholar]
- Al-Ammar, M.A.; Al-Khalifa, H.S.; Al-Salman, A.S. A proposed indoor navigation system for blind individuals. In Proceedings of the 13th International Conference on Information Integration and Web-based Applications and Services, Ho Chi Minh City, Vietnam, 5–7 December 2011; pp. 527–530. [Google Scholar]
- Hersh, M.; Johnson, M.A. Assistive Technology for Visually Impaired and Blind People; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2010. [Google Scholar]
- Wendt, O. Assistive Technology: Principles and Applications For Communication Disorders and Special Education; Brill: Leiden, The Netherlands, 2011. [Google Scholar]
- Chanana, P.; Paul, R.; Balakrishnan, M.; Rao, P. Assistive technology solutions for aiding travel of pedestrians with visual impairment. J. Rehabil. Assist. Technol. Eng. 2017, 4. [Google Scholar] [CrossRef]
- Bhowmick, A.; Hazarika, S.M. An insight into assistive technology for the visually impaired and blind people: State-of-the-art and future trends. J. Multimodal User Interfaces 2017, 11, 149–172. [Google Scholar] [CrossRef]
- Real, S.; Araujo, A. Navigation Systems for the Blind and Visually Impaired: Past Work, Challenges, and Open Problems. Sensors 2019, 19, 3404. [Google Scholar] [CrossRef] [Green Version]
- Hersh, M. The design and evaluation of assistive technology products and devices Part 1: Design. In International Encyclopedia of Rehabilitation; Blouin, M., Stone, J., Eds.; Center for International Rehabilitation Research Information and Exchange (CIRRIE), University at Buffalo: New York, NY, USA, 2010; Available online: http://sphhp.buffalo.edu/rehabilitation-science/research-and-facilities/funded-research-archive/center-for-international-rehab-research-info-exchange.html (accessed on 14 August 2020).
- Assistive Technology. 2018. Available online: https://www.who.int/news-room/fact-sheets/detail/assistive-technology (accessed on 14 August 2020).
- Lin, B.S.; Lee, C.C.; Chiang, P.Y. Simple smartphone-based guiding system for visually impaired people. Sensors 2017, 17, 1371. [Google Scholar] [CrossRef] [Green Version]
- Khan, I.; Khusro, S.; Ullah, I. Technology-assisted white cane: Evaluation and future directions. PeerJ 2018, 6, e6058. [Google Scholar] [CrossRef] [Green Version]
- Manduchi, R.; Coughlan, J. (Computer) vision without sight. Commun. ACM 2012, 55, 96–104. [Google Scholar] [CrossRef]
- Ton, C.; Omar, A.; Szedenko, V.; Tran, V.H.; Aftab, A.; Perla, F.; Bernstein, M.J.; Yang, Y. LIDAR Assist spatial sensing for the visually impaired and performance analysis. IEEE Trans. Neural Syst. Rehabil. Eng. 2018, 26, 1727–1734. [Google Scholar] [CrossRef] [PubMed]
- Croce, D.; Giarré, L.; Pascucci, F.; Tinnirello, I.; Galioto, G.E.; Garlisi, D.; Valvo, A.L. An indoor and outdoor navigation system for visually impaired people. IEEE Access 2019, 7, 170406–170418. [Google Scholar] [CrossRef]
- Galioto, G.; Tinnirello, I.; Croce, D.; Inderst, F.; Pascucci, F.; Giarré, L. Sensor fusion localization and navigation for visually impaired people. In Proceedings of the 2018 European Control Conference (ECC), Limassol, Cyprus, 12–15 June 2018; pp. 3191–3196. [Google Scholar]
- Kuriakose, B.; Shrestha, R.; Sandnes, F.E. Smartphone Navigation Support for Blind and Visually Impaired People—A Comprehensive Analysis of Potentials and Opportunities. In International Conference on Human-Computer Interaction; Springer: Berlin/Heidelberg, Germany, 2020; pp. 568–583. [Google Scholar]
- Bernsen, N.O.; Dybkjær, L. Modalities and Devices; Springer: London, UK, 2010; pp. 67–111. [Google Scholar]
- What Is Multimodality. 2013. Available online: https://www.igi-global.com/dictionary/new-telerehabilitation-services-elderly/19644 (accessed on 14 August 2020).
- Mittal, S.; Mittal, A. Versatile question answering systems: Seeing in synthesis. Int. J. Intell. Inf. Database Syst. 2011, 5, 119–142. [Google Scholar] [CrossRef]
- Jaimes, A.; Sebe, N. Multimodal human–computer interaction: A survey. Comput. Vis. Image Understand. 2007, 108, 116–134. [Google Scholar] [CrossRef]
- Bourguet, M.L. Designing and Prototyping Multimodal Commands; IOS Press: Amsterdam, The Netherlands, 2003; Volume 3, pp. 717–720. [Google Scholar]
- Bourbakis, N.; Keefer, R.; Dakopoulos, D.; Esposito, A. A multimodal interaction scheme between a blind user and the tyflos assistive prototype. In Proceedings of the 2008 20th IEEE International Conference on Tools with Artificial Intelligence, Dayton, OH, USA, 3–5 November 2008; Volume 2, pp. 487–494. [Google Scholar]
- Basori, A.H. HapAR: Handy Intelligent Multimodal Haptic and Audio-Based Mobile AR Navigation for the Visually Impaired. In Technological Trends in Improved Mobility of the Visually Impaired; Springer: Cham, Switzerland, 2020; pp. 319–334. [Google Scholar]
- Fusiello, A.; Panuccio, A.; Murino, V.; Fontana, F.; Rocchesso, D. A multimodal electronic travel aid device. In Proceedings of the Fourth IEEE International Conference on Multimodal Interfaces, Pittsburgh, PA, USA, 16 October 2002; pp. 39–44. [Google Scholar]
- Nair, V.; Budhai, M.; Olmschenk, G.; Seiple, W.H.; Zhu, Z. ASSIST: Personalized indoor navigation via multimodal sensors and high-level semantic information. In Proceedings of the European Conference on Computer Vision (ECCV); Springer International Publishing: Cham, Switzerland, 2018; pp. 128–143. [Google Scholar]
- Caraiman, S.; Morar, A.; Owczarek, M.; Burlacu, A.; Rzeszotarski, D.; Botezatu, N.; Herghelegiu, P.; Moldoveanu, F.; Strumillo, P.; Moldoveanu, A. Computer vision for the visually impaired: The sound of vision system. In Proceedings of the IEEE International Conference on Computer Vision Workshops, Venice, Italy, 22–29 October 2017; pp. 1480–1489. [Google Scholar]
- Kuriakose, B.; Shrestha, R.; Sandnes, F.E. Tools and Technologies for Blind and Visually Impaired Navigation Support: A Review. IETE Tech. Rev. 2020, 1–16. [Google Scholar] [CrossRef]
- Karray, F.; Alemzadeh, M.; Saleh, J.A.; Arab, M.N. Human-computer interaction: Overview on state of the art. Int. J. Smart Sens. Intell. Syst. 2008, 1, 137–159. [Google Scholar] [CrossRef] [Green Version]
- Oviatt, S.; Lunsford, R.; Coulston, R. Individual differences in multimodal integration patterns: What are they and why do they exist? In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Portland, OR, USA, 2–7 April 2005; pp. 241–249. [Google Scholar]
- Bohus, D.; Horvitz, E. Facilitating multiparty dialog with gaze, gesture, and speech. In Proceedings of the International Conference on Multimodal Interfaces and the Workshop on Machine Learning for Multimodal Interaction, Beijing, China, 8–12 November 2010; pp. 1–8. [Google Scholar]
- Oviatt, S.; Cohen, P.; Wu, L.; Duncan, L.; Suhm, B.; Bers, J.; Holzman, T.; Winograd, T.; Landay, J.; Larson, J.; et al. Designing the user interface for multimodal speech and pen-based gesture applications: State-of-the-art systems and future research directions. Hum. Comput. Interact. 2000, 15, 263–322. [Google Scholar] [CrossRef]
- Huang, D.S.; Jo, K.H.; Figueroa-García, J.C. Intelligent Computing Theories and Application. In Proceedings of the 13th International Conference, ICIC 2017, Liverpool, UK, 7–10 August 2017; Springer: Berlin/Heidelberg, Germany, 2017; Volume 10362. [Google Scholar]
- Palanque, P.; Graham, T.C.N. Interactive Systems. Design, Specification, and Verification. In Proceedings of the 7th International Workshop, DSV-IS 2000, Limerick, Ireland, 5–6 June 2000; Revised Papers, Number 1946. Springer Science & Business Media: Berlin/Heidelberg, Germany, 2001. [Google Scholar]
- Bernsen, N.O. Multimodality theory. In Multimodal User Interfaces; Springer: Berlin/Heidelberg, Germany, 2008; pp. 5–29. [Google Scholar]
- Jacko, J.A. Human Computer Interaction Handbook: Fundamentals, Evolving Technologies, and Emerging Applications; CRC Press: Boca Raton, FL, USA, 2012. [Google Scholar]
- Kurosu, M. Human-Computer Interaction: Interaction Modalities and Techniques. In Proceedings of the 15th International Conference, HCI International 2013, Las Vegas, NV, USA, 21–26 July 2013; Springer: Berlin/Heidelberg, Germany, 2013; Volume 8007. [Google Scholar]
- Bainbridge, W.S. Berkshire Encyclopedia of Human-Computer Interaction; Berkshire Publishing Group LLC: Great Barrington, MA, USA, 2004; Volume 1. [Google Scholar]
- Epstein, Z. Siri Said to Be Driving Force behind Huge iPhone 4S Sales. 2011. Available online: https://bgr.com/2011/11/02/siri-said-to-be-driving-force-behind-huge-iphone-4s-sales/ (accessed on 14 August 2020).
- Grifoni, P.; Ferri, F.; Caschera, M.C.; D’Ulizia, A.; Mazzei, M. MIS: Multimodal Interaction Services in a cloud perspective. arXiv 2014, arXiv:1704.00972. [Google Scholar]
- Hoy, M.B. Alexa, Siri, Cortana, and more: An introduction to voice assistants. Med. Ref. Serv. Q. 2018, 37, 81–88. [Google Scholar] [CrossRef]
- Kepuska, V.; Bohouta, G. Next-generation of virtual personal assistants (microsoft cortana, apple siri, amazon alexa and google home). In Proceedings of the 2018 IEEE 8th Annual Computing and Communication Workshop and Conference (CCWC), Las Vegas, NV, USA, 8–10 January 2018; pp. 99–103. [Google Scholar]
- Kurkovsky, S. Multimodality in Mobile Computing and Mobile Devices: Methods for Adaptable Usability; IGI Global: Hershey, PA, USA, 2010. [Google Scholar]
- Djaid, N.T.; Saadia, N.; Ramdane-Cherif, A. Multimodal Fusion Engine for an Intelligent Assistance Robot Using Ontology. Procedia Comput. Sci. 2015, 52, 129–136. [Google Scholar] [CrossRef] [Green Version]
- Corradini, A.; Mehta, M.; Bernsen, N.O.; Martin, J.; Abrilian, S. Multimodal input fusion in human-computer interaction. NATO Sci. Ser. Sub Ser. III Comput. Syst. Sci. 2005, 198, 223. [Google Scholar]
- D’Ulizia, A. Exploring multimodal input fusion strategies. In Multimodal Human Computer Interaction and Pervasive Services; IGI Global: Hershey, PA, USA, 2009; pp. 34–57. [Google Scholar]
- Caschera, M.C.; Ferri, F.; Grifoni, P. Multimodal interaction systems: Information and time features. Int. J. Web Grid Serv. 2007, 3, 82–99. [Google Scholar] [CrossRef]
- Grifoni, P. Multimodal Human Computer Interaction and Pervasive Services; IGI Global: Hershey, PA, USA, 2009. [Google Scholar]
- Dumas, B.; Lalanne, D.; Oviatt, S. Multimodal interfaces: A survey of principles, models and frameworks. In Human Machine Interaction; Springer: Berlin/Heidelberg, Germany, 2009; pp. 3–26. [Google Scholar]
- Vainio, T. Exploring cues and rhythm for designing multimodal tools to support mobile users in wayfinding. In CHI’09 Extended Abstracts on Human Factors in Computing Systems; ACM: New York, NY, USA, 2009; pp. 3715–3720. [Google Scholar]
- Brock, A.M.; Truillet, P.; Oriola, B.; Picard, D.; Jouffrais, C. Interactivity improves usability of geographic maps for visually impaired people. Hum. Comput. Interact. 2015, 30, 156–194. [Google Scholar] [CrossRef]
- Paternò, F. Interactive Systems: Design, Specification, and Verification. In Proceedings of the 1st Eurographics Workshop, Bocca di Magra, Italy, 8–10 June 1994; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2012. [Google Scholar]
- Sears, A.; Jacko, J.A. Human-Computer Interaction: Designing for Diverse Users and Domains; CRC Press: Boca Raton, FL, USA, 2009. [Google Scholar]
- Van der Bie, J.; Ben Allouch, S.; Jaschinski, C. Communicating Multimodal Wayfinding Messages for Visually Impaired People via Wearables. In Proceedings of the 21st International Conference on Human-Computer Interaction with Mobile Devices and Services, Taipei, Taiwan, 1–4 October 2019; Association for Computing Machinery: New York, NY, USA, 2019; pp. 1–7. [Google Scholar]
- Tang, M. Benefits of Bone Conduction and Bone Conduction Headphones. 2019. Available online: https://www.soundguys.com/bone-conduction-headphones-20580/ (accessed on 14 August 2020).
- Gallo, S.; Chapuis, D.; Santos-Carreras, L.; Kim, Y.; Retornaz, P.; Bleuler, H.; Gassert, R. Augmented white cane with multimodal haptic feedback. In Proceedings of the 2010 3rd IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics, Tokyo, Japan, 26–29 September 2010; pp. 149–155. [Google Scholar]
- Zeng, L.; Weber, G.; Simros, M.; Conradie, P.; Saldien, J.; Ravyse, I.; van Erp, J.; Mioch, T. Range-IT: Detection and multimodal presentation of indoor objects for visually impaired people. In Proceedings of the MobileHCI ’17: 19th International Conference on Human-Computer Interaction with Mobile Devices and Services, Vienna, Austria, 4–7 September 2017; Association for Computing Machinery: New York, NY, USA, 2017; pp. 1–6. [Google Scholar]
- Hosseini, S.M.F.; Riener, A.; Bose, R.; Jeon, M. “Listen2dRoom”: Helping Visually Impaired People Navigate Indoor Environments Using an Ultrasonic Sensor-Based Orientation Aid; Georgia Institute of Technology: Atlanta, GA, USA, 2014. [Google Scholar]
- Ahmetovic, D.; Gleason, C.; Ruan, C.; Kitani, K.; Takagi, H.; Asakawa, C. NavCog: A navigational cognitive assistant for the blind. In Proceedings of the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services, Florence, Italy, 6–9 September 2016; pp. 90–99. [Google Scholar]
- Chang, Y.; Chen, J.; Franklin, T.; Zhang, L.; Ruci, A.; Tang, H.; Zhu, Z. Multimodal Information Integration for Indoor Navigation Using a Smartphone. In Proceedings of the 2020 IEEE 21st International Conference on Information Reuse and Integration for Data Science (IRI), Las Vegas, NV, USA, 11–13 August 2020; pp. 59–66. [Google Scholar]
- Loomis, J.M.; Golledge, R.G.; Klatzky, R.L. Navigation system for the blind: Auditory display modes and guidance. Presence 1998, 7, 193–203. [Google Scholar] [CrossRef]
- Loomis, J.M.; Marston, J.R.; Golledge, R.G.; Klatzky, R.L. Personal guidance system for people with visual impairment: A comparison of spatial displays for route guidance. J. Vis. Impair. Blind. 2005, 99, 219–232. [Google Scholar] [CrossRef]
- Wang, H.C.; Katzschmann, R.K.; Teng, S.; Araki, B.; Giarré, L.; Rus, D. Enabling independent navigation for visually impaired people through a wearable vision-based feedback system. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017; pp. 6533–6540. [Google Scholar]
- Diaz, C.; Payandeh, S. Multimodal Sensing Interface for Haptic Interaction. J. Sens. 2017, 2017. [Google Scholar] [CrossRef]
- Lock, J.C.; Cielniak, G.; Bellotto, N. A Portable Navigation System with an Adaptive Multimodal Interface for the Blind. In Proceedings of the 2017 AAAI Spring Symposium Series, Stanford, CA, USA, 27–29 March 2017; AAAI: Palo Alto, CA, USA, 2017. [Google Scholar]
- Bellotto, N. A multimodal smartphone interface for active perception by visually impaired. In IEEE SMC International Workshop on Human-Machine Systems, Cyborgs and Enhancing Devices (HUMASCEND); IEEE: Manchester, UK, 2013. [Google Scholar]
- Turunen, M.; Hakulinen, J.; Kainulainen, A.; Melto, A.; Hurtig, T. Design of a rich multimodal interface for mobile spoken route guidance. In Proceedings of the Eighth Annual Conference of the International Speech Communication Association, Antwerp, Belgium, 27–31 August 2007. [Google Scholar]
- Ducasse, J.; Brock, A.M.; Jouffrais, C. Accessible interactive maps for visually impaired users. In Mobility of Visually Impaired People; Springer: Berlin/Heidelberg, Germany, 2018; pp. 537–584. [Google Scholar]
- Brock, A.; Truillet, P.; Oriola, B.; Picard, D.; Jouffrais, C. Design and user satisfaction of interactive maps for visually impaired people. In International Conference on Computers for Handicapped Persons; Springer: Berlin/Heidelberg, Germany, 2012; pp. 544–551. [Google Scholar]
- Wang, Z.; Li, B.; Hedgpeth, T.; Haven, T. Instant tactile-audio map: Enabling access to digital maps for people with visual impairment. In Proceedings of the 11th International ACM SIGACCESS Conference on Computers and Accessibility, Pittsburgh, PA, USA, 25–28 October 2009; pp. 43–50. [Google Scholar]
- Miele, J.A.; Landau, S.; Gilden, D. Talking TMAP: Automated generation of audio-tactile maps using Smith-Kettlewell’s TMAP software. Br. J. Vis. Impair. 2006, 24, 93–100. [Google Scholar] [CrossRef]
- Giudice, N.A.; Guenther, B.A.; Jensen, N.A.; Haase, K.N. Cognitive mapping without vision: Comparing wayfinding performance after learning from digital touchscreen-based multimodal maps vs. embossed tactile overlays. Front. Hum. Neurosci. 2020, 14, 87. [Google Scholar] [CrossRef] [Green Version]
- Poppinga, B.; Magnusson, C.; Pielot, M.; Rassmus-Gröhn, K. TouchOver map: Audio-tactile exploration of interactive maps. In Proceedings of the 13th International Conference on Human Computer Interaction with Mobile Devices and Services, Stockholm, Sweden, 30 August–2 September 2011; pp. 545–550. [Google Scholar]
- Zeng, L.; Weber, G. Exploration of location-aware you-are-here maps on a pin-matrix display. IEEE Trans. Hum. Mach. Syst. 2015, 46, 88–100. [Google Scholar] [CrossRef]
- Bahram, S. Multimodal eyes-free exploration of maps: TIKISI for maps. ACM SIGACCESS Access. Comput. 2013, 3–11. [Google Scholar] [CrossRef]
- Yatani, K.; Banovic, N.; Truong, K. SpaceSense: Representing geographical information to visually impaired people using spatial tactile feedback. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Austin, TX, USA, 5–10 May 2012; pp. 415–424. [Google Scholar]
- Kunz, A.; Miesenberger, K.; Zeng, L.; Weber, G. Virtual navigation environment for blind and low vision people. In International Conference on Computers Helping People with Special Needs; Springer: Berlin/Heidelberg, Germany, 2018; pp. 114–122. [Google Scholar]
- Lécuyer, A.; Mobuchon, P.; Mégard, C.; Perret, J.; Andriot, C.; Colinot, J.P. HOMERE: A multimodal system for visually impaired people to explore virtual environments. In Proceedings of the IEEE Virtual Reality, Los Angeles, CA, USA, 22–26 March 2003; pp. 251–258. [Google Scholar]
- Rivière, M.A.; Gay, S.; Romeo, K.; Pissaloux, E.; Bujacz, M.; Skulimowski, P.; Strumillo, P. NAV-VIR: An audio-tactile virtual environment to assist visually impaired people. In Proceedings of the 2019 9th International IEEE/EMBS Conference on Neural Engineering (NER), San Francisco, CA, USA, 20–23 March 2019; pp. 1038–1041. [Google Scholar]
- Khoo, W.L.; Seidel, E.L.; Zhu, Z. Designing a virtual environment to evaluate multimodal sensors for assisting the visually impaired. In International Conference on Computers for Handicapped Persons; Springer: Berlin/Heidelberg, Germany, 2012; pp. 573–580. [Google Scholar]
- Zhao, Y.; Bennett, C.L.; Benko, H.; Cutrell, E.; Holz, C.; Morris, M.R.; Sinclair, M. Enabling people with visual impairments to navigate virtual reality with a haptic and auditory cane simulation. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, QC, Canada, 21–26 April 2018; Association for Computing Machinery: New York, NY, USA, 2018; pp. 1–14. [Google Scholar]
- Lahav, O.; Schloerb, D.; Kumar, S.; Srinivasan, M. A virtual environment for people who are blind–a usability study. J. Assist. Technol. 2012, 6. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Ho, C.; Reed, N.; Spence, C. Multisensory in-car warning signals for collision avoidance. Hum. Factors 2007, 49, 1107–1114. [Google Scholar] [CrossRef] [PubMed]
- Stanney, K.; Samman, S.; Reeves, L.; Hale, K.; Buff, W.; Bowers, C.; Goldiez, B.; Nicholson, D.; Lackey, S. A paradigm shift in interactive computing: Deriving multimodal design principles from behavioral and neurological foundations. Int. J. Hum. Comput. Interact. 2004, 17, 229–257. [Google Scholar] [CrossRef]
- Calvert, G.; Spence, C.; Stein, B.E. The Handbook of Multisensory Processes; MIT Press: Cambridge, MA, USA, 2004. [Google Scholar]
- Lee, J.H.; Spence, C. Assessing the benefits of multimodal feedback on dual-task performance under demanding conditions. In Proceedings of the 22nd British HCI Group Annual Conference on People and Computers: Culture, Creativity, Interaction (HCI), Liverpool, UK, 1–5 September 2008; pp. 185–192. [Google Scholar]
- Oviatt, S.; Schuller, B.; Cohen, P.; Sonntag, D.; Potamianos, G. The Handbook of Multimodal-Multisensor Interfaces, Volume 1: Foundations, User Modeling, and Common Modality Combinations; Morgan & Claypool: San Rafael, CA, USA, 2017. [Google Scholar]
- Rodrigues, J.; Cardoso, P.; Monteiro, J.; Figueiredo, M. Handbook of Research on Human-Computer Interfaces, Developments, and Applications; IGI Global: Hershey, PA, USA, 2016. [Google Scholar]
- Common Sense Suggestions for Developing Multimodal User Interfaces. 2016. Available online: https://www.w3.org/TR/mmi-suggestions/ (accessed on 14 August 2020).
- Havik, E.M.; Kooijman, A.C.; Steyvers, F.J. The effectiveness of verbal information provided by electronic travel aids for visually impaired persons. J. Vis. Impair. Blind. 2011, 105, 624–637. [Google Scholar] [CrossRef]
- Adebiyi, A.; Sorrentino, P.; Bohlool, S.; Zhang, C.; Arditti, M.; Goodrich, G.; Weiland, J.D. Assessment of feedback modalities for wearable visual aids in blind mobility. PLoS ONE 2017, 12, e0170531. [Google Scholar] [CrossRef] [PubMed]
- Jacob, S.V.; MacKenzie, I.S. Comparison of Feedback Modes for the Visually Impaired: Vibration vs. Audio. In International Conference on Universal Access in Human-Computer Interaction; Springer: Berlin/Heidelberg, Germany, 2018; pp. 420–432. [Google Scholar]
- Wentzel, J.; Velleman, E.; van der Geest, T. Developing accessibility design guidelines for wearables: Accessibility standards for multimodal wearable devices. In International Conference on Universal Access in Human-Computer Interaction; Springer International Publishing: Cham, Switzerland, 2016; pp. 109–119. [Google Scholar]
- Schneider-Hufschmidt, M. Human factors (HF): Multimodal interaction, communication and navigation guidelines. In Proceedings of the 19th International Symposium on Human Factors in Telecommunication, Berlin/Heidelberg, Germany, 1–4 December 2003; European Telecommunications Standards Institute: Sophia Antipolis, France, 2003; Volume 1, pp. 1–53. [Google Scholar]
- Lahat, D.; Adali, T.; Jutten, C. Multimodal data fusion: An overview of methods, challenges, and prospects. Proc. IEEE 2015, 103, 1449–1477. [Google Scholar] [CrossRef] [Green Version]
- Gjoreski, H.; Ciliberto, M.; Wang, L.; Ordonez Morales, F.J.; Mekki, S.; Valentin, S.; Roggen, D. The University of Sussex-Huawei Locomotion and Transportation Dataset for Multimodal Analytics With Mobile Devices. IEEE Access 2018, 6, 42592–42604. [Google Scholar] [CrossRef]
- Brodeur, S.; Carrier, S.; Rouat, J. CREATE: Multimodal Dataset for Unsupervised Learning and Generative Modeling of Sensory Data from a Mobile Robot. IEEE Dataport 2018. [Google Scholar] [CrossRef]
- Cheng, R.; Wang, K.; Bai, J.; Xu, Z. OpenMPR: Recognize places using multimodal data for people with visual impairments. Meas. Sci. Technol. 2019, 30, 124004. [Google Scholar] [CrossRef] [Green Version]
- Csapó, A.; Wersényi, G.; Jeon, M. A survey on hardware and software solutions for multimodal wearable assistive devices targeting the visually impaired. Acta Polytech. Hung. 2016, 13, 39. [Google Scholar]
- Liljedahl, M.; Lindberg, S.; Delsing, K.; Polojärvi, M.; Saloranta, T.; Alakärppä, I. Testing two tools for multimodal navigation. Adv. Hum. Comput. Interact. 2012, 2012. [Google Scholar] [CrossRef] [Green Version]
- Gallacher, S.; Papadopoulou, E.; Taylor, N.K.; Williams, M.H. Learning user preferences for adaptive pervasive environments: An incremental and temporal approach. ACM Trans. Auton. Adapt. Syst. TAAS 2013, 8, 1–26. [Google Scholar] [CrossRef]
- Yao, Y.; Zhao, Y.; Wang, J.; Han, S. A model of machine learning based on user preference of attributes. In International Conference on Rough Sets and Current Trends in Computing; Springer: Berlin/Heidelberg, Germany, 2006; pp. 587–596. [Google Scholar]
- Phillips, B.; Zhao, H. Predictors of assistive technology abandonment. Assist. Technol. 1993, 5, 36–45. [Google Scholar] [CrossRef]
Table 1. Multimodal navigation systems.

| System | Main Software/Hardware Components | Localisation Technologies | Modalities Involved |
|---|---|---|---|
| **Tested with target users** | | | |
| EyeBeacons [61] | Smartphone, bone conduction headset, smartwatch, SideWalk wayfinding framework | IMU | Aural, Visual and Tactile |
| ASSIST [33] | Smartphone, BLE beacons, Google Tango | IMU, BLE beacons | Aural, Visual and Vibration Alerts |
| Sound of Vision [34] | IR-based depth sensor, stereo cameras and IMU device | IMU | Aural and Tactile |
| NavCog [66] | Smartphone, BLE beacons | BLE beacons | Sound Alerts and Verbal Cues |
| Personal Guidance System [68] | GPS receiver, GIS, keypad, earphones, speech synthesizer, acoustic display hardware | GPS | Aural and Verbal Commands |
| Wearable Vision-based System [70] | Embedded computer, depth sensor camera, haptic array, braille display | Vision-based | Haptic, Braille and Aural |
| **Tested with blindfolded users** | | | |
| HapAR [31] | Smartphone, voice recognizer | IMU | Voice, Audio and Vibration |
| Personal Radar [65] | Ultrasonic sensors, tactile actuators, Arduino ATmega2560 microcontroller | Ultrasonic-based | Aural and Vibrotactile |
| **Trial evaluation** | | | |
| Augmented White Cane [63] | White cane, distance and obstacle sensors, shock device electronics, vibrating motors | Ultrasonic-based | Shock Simulations, Vibrotactile and Audio Alerts |
| Electronic Travel Aid [32] | Earphones, sunglasses fitted with two micro cameras, palmtop computer | Vision-based | Aural and Visual |
| Range-IT [64] | 3D depth camera, 9-DOF IMU, bone conduction headphones, vibrotactile belt, smartphone | Vision-based | Aural and Vibrotactile |
| **No evaluations reported** | | | |
| Tyflos [30] | Stereo cameras, microphone, ear speakers, portable computer, vibration array vest | Vision-based | Aural and Vibration |
| iASSIST [67] | Smartphone, ARKit, Bluetooth beacons, 2D/3D models | Beacons, Wi-Fi/cellular | Voice and Vibrations |
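Several systems in Table 1 (ASSIST [33], NavCog [66], iASSIST [67]) pair BLE beacon localisation with aural or vibration output. As a rough sketch of that shared pattern, and not code from any of the cited papers, the snippet below converts beacon RSSI readings to distance with a standard log-distance path-loss model and picks the nearest known landmark to announce. The transmit power, path-loss exponent and beacon data are invented placeholders; real systems additionally smooth RSSI over time and fuse IMU or vision data.

```python
# Hypothetical sketch of beacon-based localisation; constants are placeholders.
from typing import Dict, Tuple

TX_POWER_DBM = -59        # assumed RSSI at 1 m, device-specific
PATH_LOSS_EXPONENT = 2.2  # ~2.0 in free space, higher indoors


def rssi_to_distance_m(rssi_dbm: float) -> float:
    """Log-distance path-loss model: d = 10^((P_tx - RSSI) / (10 * n))."""
    return 10 ** ((TX_POWER_DBM - rssi_dbm) / (10 * PATH_LOSS_EXPONENT))


def nearest_landmark(scans: Dict[str, float],
                     landmarks: Dict[str, str]) -> Tuple[str, float]:
    """Strongest RSSI = smallest estimated distance (model is monotonic)."""
    beacon_id, rssi = max(scans.items(), key=lambda kv: kv[1])
    return landmarks[beacon_id], rssi_to_distance_m(rssi)


if __name__ == "__main__":
    landmarks = {"b1": "Elevator", "b2": "Main entrance"}
    scans = {"b1": -71.0, "b2": -64.0}   # dBm values from one BLE scan
    name, dist = nearest_landmark(scans, landmarks)
    # The resulting message would then be routed to speech and/or vibration.
    print(f"{name} about {dist:.0f} m away")
```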
Table 2. Multimodal interfaces.

| System | Main Software/Hardware Components | Modalities Involved |
|---|---|---|
| **No evaluations reported** | | |
| ActiVis [72] | Google Tango, bone-conducting headset | Aural and Vibration Cues |
| Human-in-the-Loop [73] | Smartphone, headset, IVONA TTS engine, OpenAL | Vocal Messages, Aural and Vibrations |
| TravelMan [74] | Smartphone with camera, Bluetooth GPS device | Aural, Visual and Tactile |
Table 3. Multimodal maps.

| System | Main Software/Hardware Components | Modalities Involved |
|---|---|---|
| **Tested with target users** | | |
| Interactive Map [76] | Multi-touch screen, Inkscape editor, TTS engine, middleware | Aural and Tactile |
| Instant Tactile-Audio Map [77] | Tactile touchpad, SVG, tactile embosser | Aural and Tactile |
| Vibro-Audio Map (VAM) [79] | Tablet | Aural and Tactile |
| TouchOver Map [80] | Android device, OpenStreetMap | Aural and Tactile |
| You-Are-Here (YAH) Map [81] | Touch-sensitive pin-matrix display, mobile phone, Wiimote cane, computer, OpenStreetMap | Aural and Tactile |
| Tikisi [82] | Tikisi framework | Multitouch Gestures, Voice Commands and Speech |
| SpaceSense [83] | iPhone device, vibration motors, Flite TTS package | Aural and Tactile |
| **No evaluations reported** | | |
| Talking TMAP [78] | Braille embossers, SVG, tactile tablet, TMAP software, Macromedia Director | Aural and Tactile |
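A recurring interaction loop behind the touchscreen maps in Table 3 (e.g., TouchOver Map [80]) is hit-testing the finger position against named map features and announcing a feature once as the finger enters it. The sketch below illustrates that loop with invented geometry; `Feature`, `AudioTactileMap` and the 40 ms vibration pulse are hypothetical names and values, not taken from the cited implementations.

```python
# Hypothetical sketch of audio-tactile map exploration; data is invented.
from dataclasses import dataclass
from typing import Callable, List, Optional


@dataclass
class Feature:
    """A named map element with a screen-space bounding box (pixels)."""
    name: str
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1


class AudioTactileMap:
    """Speaks a feature name once when the finger enters it, with a
    short vibration pulse marking the boundary crossing."""

    def __init__(self, features: List[Feature],
                 speak: Callable[[str], None],
                 vibrate: Callable[[int], None]) -> None:
        self.features = features
        self.speak = speak
        self.vibrate = vibrate
        self._current: Optional[Feature] = None

    def on_touch_move(self, x: float, y: float) -> None:
        hit = next((f for f in self.features if f.contains(x, y)), None)
        if hit is not self._current:       # entered or left a feature
            self._current = hit
            if hit is not None:
                self.vibrate(40)           # 40 ms pulse at the boundary
                self.speak(hit.name)


if __name__ == "__main__":
    amap = AudioTactileMap(
        [Feature("Main Street", 0, 100, 320, 140)],
        speak=lambda s: print(f"[TTS] {s}"),
        vibrate=lambda ms: print(f"[vibrate {ms} ms]"),
    )
    amap.on_touch_move(50, 90)    # off the street: silent
    amap.on_touch_move(50, 120)   # enters the street: buzz + "Main Street"
    amap.on_touch_move(60, 125)   # still on it: no repeated announcement
```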
Table 4. Multimodal virtual learning environments.

| System | Main Software/Hardware Components | Modalities Involved |
|---|---|---|
| **Tested with target users** | | |
| NAV-VIR [86] | Tablet, two servomotors moving a flat joystick, Arduino single-board microcontroller, immersive HRTF-based 3D audio simulation (Google VR Audio) | Force and 3D Audio Cues |
| HOMERE [85] | VIRTUOSE 3D haptic device, infrared lamps, speakers, gamepad, Sense8 for graphics rendering, VORTEX 1.5 for collision detection | Force, Thermal, Aural, Haptic and Visual |
| Canetroller [88] | Folding cane, magnetic particle brake, voice coil, Vive tracker, VR headset for 3D audio, IMU with gyro and accelerometer, Unity game engine | Braking Feedback, Vibrotactile and 3D Audio |
| BlindAid [89] | Computer, desktop Phantom | Aural and Haptic |
| **Tested with blindfolded users** | | |
| Virtual Navigation Environment [84] | InterSense DOF tracking system, Oculus Rift DK, headphones, Arduino Uno microcontroller, Unity game engine | Aural and Haptic |
| **No evaluations reported** | | |
| Multimodal Sensors VE [87] | Xbox controller, Microsoft Kinect, head-mounted electrodes, BrainPort vision technology | Aural, Vibration and Haptics |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations. |
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).