
BotMap: Non-Visual Panning and Zooming with an Actuated Tabletop Tangible Interface

Published: 07 September 2018

Abstract

The development of novel shape-changing or actuated tabletop tangible interfaces opens new perspectives for the design of physical and dynamic maps, especially for visually impaired (VI) users. Such maps would allow non-visual haptic exploration with advanced functions, such as panning and zooming. In this study, we designed an actuated tangible tabletop interface, called BotMap, that allows the exploration of geographic data through non-visual panning and zooming. In BotMap, small robots represent landmarks and move to their correct positions whenever the map is refreshed. Users can interact with the robots to retrieve the names of the landmarks they represent. We designed two interfaces, named Keyboard and Sliders, which enable users to pan and zoom. Two evaluations were conducted with, respectively, ten blindfolded and eight VI participants. Results show that both interfaces were usable, with a slight advantage for the Keyboard interface in terms of navigation performance and map comprehension, and that, even when many panning and zooming operations were required, VI participants were able to understand the maps. Most participants managed to accurately reconstruct maps after exploration. Finally, we observed three VI people using the system to perform a classical task: finding the most appropriate itinerary for a journey.



• Published in

  ACM Transactions on Computer-Human Interaction, Volume 25, Issue 4
  August 2018, 170 pages
  ISSN: 1073-0516
  EISSN: 1557-7325
  DOI: 10.1145/3266364

  Copyright © 2018 ACM

  Publication rights licensed to ACM. ACM acknowledges that this contribution was authored or co-authored by an employee, contractor, or affiliate of a national government. As such, the Government retains a nonexclusive, royalty-free right to publish or reproduce this article, or to allow others to do so, for Government purposes only.

  Publisher

  Association for Computing Machinery, New York, NY, United States

  Publication History

  • Received: 1 July 2017
  • Revised: 1 February 2018
  • Accepted: 1 April 2018
  • Published: 7 September 2018

  Published in TOCHI Volume 25, Issue 4

      Qualifiers

      • research-article
      • Research
      • Refereed
