research-article

The Tangible Desktop: A Multimodal Approach to Nonvisual Computing

Published: 11 August 2017

Abstract

Audio-only interfaces, facilitated through text-to-speech screen reading software, have been the primary mode of computer interaction for blind and low-vision computer users for more than four decades. During this time, the advances that have made visual interfaces faster and easier to use, from direct manipulation to skeuomorphic design, have not been paralleled in nonvisual computing environments. The screen reader–dependent community is left with no alternative means of engaging with our rapidly advancing technological infrastructure. In this article, we describe our efforts to understand the problems that exist with audio-only interfaces. Based on four months of observing screen reader use at a computer training school for blind and low-vision adults, we identify three problem areas within audio-only interfaces: ephemerality, linear interaction, and unidirectional communication. We then evaluate a multimodal approach to computer interaction, the Tangible Desktop, which addresses these problems by moving semantic information from the auditory to the tactile channel. Our evaluation demonstrated that, among novice screen reader users, the Tangible Desktop improved task completion times by an average of 6 minutes compared to traditional audio-only computer systems.



Published in: ACM Transactions on Accessible Computing, Volume 10, Issue 3 (August 2017), 76 pages.
ISSN: 1936-7228
EISSN: 1936-7236
DOI: 10.1145/3132048

              Copyright © 2017 ACM

              Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

              Publisher

              Association for Computing Machinery

              New York, NY, United States

Publication History

• Received: 1 April 2016
• Revised: 1 March 2017
• Accepted: 1 March 2017
• Published: 11 August 2017

              Qualifiers

              • research-article
              • Research
              • Refereed
