DOI: 10.1145/3308561.3353788
research-article

Leveraging Augmented Reality to Create Apps for People with Visual Disabilities: A Case Study in Indoor Navigation

Published: 24 October 2019

ABSTRACT

The introduction of augmented reality technology to iOS and Android enables, for the first time, mainstream smartphones to estimate their own motion in 3D space with high accuracy. For assistive technology researchers, this development presents a potential opportunity. In this spirit, we present our work leveraging these technologies to create a smartphone app to empower people who are visually impaired to more easily navigate indoor environments. Our app, Clew, allows users to record routes and then load them, at any time, providing automatic guidance (using haptic, speech, and sound feedback) along the route. We present our user-centered design process, Clew's system architecture and technical details, and both small and large-scale evaluations of the app. We discuss opportunities, pitfalls, and design guidelines for utilizing augmented reality for orientation and mobility apps. Our work expands the capabilities of technology for orientation and mobility that can be distributed on a mass scale.
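The guidance loop the abstract describes, comparing the phone's AR-estimated pose against the next point of a recorded route and turning the difference into feedback, can be sketched as follows. This is an illustrative Python sketch, not Clew's actual implementation (Clew is an iOS app built on ARKit); the planar coordinates, the yaw convention, and the function name are all assumptions made for the example.

```python
import math

def heading_to_next_waypoint(position, yaw, waypoint):
    """Signed angle (radians) the user should turn to face the waypoint.

    position: (x, z) of the phone projected onto the floor plane
    yaw:      current heading in radians (0 = facing +z, an assumed convention)
    waypoint: (x, z) of the next recorded route point
    """
    dx = waypoint[0] - position[0]
    dz = waypoint[1] - position[1]
    target = math.atan2(dx, dz)  # bearing from the user to the waypoint
    # Wrap the correction into (-pi, pi] so the suggested turn is minimal:
    # a small positive value means "turn right", negative means "turn left".
    return (target - yaw + math.pi) % (2 * math.pi) - math.pi
```

An app could run this on every AR frame: a correction near zero might trigger an "on course" haptic pulse, while larger values would drive spoken "turn left"/"turn right" prompts of the kind the abstract mentions.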


Published in:

ASSETS '19: Proceedings of the 21st International ACM SIGACCESS Conference on Computers and Accessibility
October 2019, 730 pages
ISBN: 9781450366762
DOI: 10.1145/3308561

              Copyright © 2019 ACM


              Publisher

              Association for Computing Machinery

              New York, NY, United States




              Acceptance Rates

ASSETS '19 paper acceptance rate: 41 of 158 submissions (26%). Overall acceptance rate: 436 of 1,556 submissions (28%).
