Counterpoint: Exploring Mixed-Scale Gesture Interaction for AR Applications

ABSTRACT
This paper presents ongoing work on a design exploration for mixed-scale gestures, which interleave microgestures with larger gestures for computer interaction. We describe three prototype applications that illustrate different facets of this multi-dimensional design space. The applications demonstrate a variety of tasks on a Microsoft HoloLens augmented reality display, using different combinations of wearable sensors. We discuss future work on expanding the design space and its exploration, along with plans for evaluating mixed-scale gesture designs.