Abstract
Audio-only interfaces, delivered through text-to-speech screen-reading software, have been the primary mode of computer interaction for blind and low-vision users for more than four decades. During this time, the advances that have made visual interfaces faster and easier to use, from direct manipulation to skeuomorphic design, have not been paralleled in nonvisual computing environments, leaving the screen-reader-dependent community with no alternative way to engage with our rapidly advancing technological infrastructure. In this article, we describe our efforts to understand the problems inherent in audio-only interfaces. Based on four months of observing screen reader use at a computer training school for blind and low-vision adults, we identify three problem areas in audio-only interaction: ephemerality, linear interaction, and unidirectional communication. We then evaluate the Tangible Desktop, a multimodal approach to computer interaction that addresses these problems by moving semantic information from the auditory to the tactile channel. In our evaluation, novice screen reader users completed tasks an average of 6 minutes faster with the Tangible Desktop than with traditional audio-only computer systems.
The Tangible Desktop: A Multimodal Approach to Nonvisual Computing