Abstract
The development of novel shape-changing or actuated tabletop tangible interfaces opens new perspectives for the design of physical and dynamic maps, especially for visually impaired (VI) users. Such maps would allow non-visual haptic exploration with advanced functions, such as panning and zooming. In this study, we designed an actuated tangible tabletop interface, called BotMap, that allows the exploration of geographic data through non-visual panning and zooming. In BotMap, small robots represent landmarks and move to their correct positions whenever the map is refreshed. Users can interact with the robots to retrieve the names of the landmarks they represent. We designed two interfaces, named Keyboard and Sliders, which enable users to pan and zoom. Two evaluations were conducted with, respectively, ten blindfolded and eight VI participants. Results show that both interfaces were usable, with a slight advantage for the Keyboard interface in terms of navigation performance and map comprehension, and that VI participants were able to understand the maps even when many panning and zooming operations were required. Most participants managed to accurately reconstruct maps after exploration. Finally, we observed three VI people using the system to perform a classic task: finding the most appropriate itinerary for a journey.
BotMap: Non-Visual Panning and Zooming with an Actuated Tabletop Tangible Interface