DOI: 10.1145/3170427.3188513

Counterpoint: Exploring Mixed-Scale Gesture Interaction for AR Applications

Published: 20 April 2018

ABSTRACT

This paper presents ongoing work on a design exploration for mixed-scale gestures, which interleave microgestures with larger gestures for computer interaction. We describe three prototype applications that show various facets of this multi-dimensional design space. These applications portray various tasks on a HoloLens Augmented Reality display, using different combinations of wearable sensors. Future work toward expanding the design space and its exploration is discussed, along with plans for evaluating mixed-scale gesture design.
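
The abstract does not describe how microgestures and larger gestures are combined in software. As a purely illustrative sketch, not the authors' implementation, the snippet below assumes two hypothetical event streams: a coarse hand-tracking stream (an arm-scale gesture selects a target) and a finger-scale microgesture stream (small finger movements adjust the selected target in place). All names used here (GestureEvent, MixedScaleController, "thumb_slide") are invented for illustration.

from dataclasses import dataclass
from typing import Optional

@dataclass
class GestureEvent:
    # Hypothetical unified event: "large" events come from full-hand tracking,
    # "micro" events from a finger-scale sensor (e.g. a radar or ring device).
    scale: str          # "large" or "micro"
    name: str           # e.g. "point_select", "grab_release", "thumb_slide"
    value: float = 0.0  # magnitude for continuous microgestures

class MixedScaleController:
    # Interleaves the two scales: a large gesture selects a target, and
    # subsequent microgestures refine it without leaving the coarse task.
    def __init__(self) -> None:
        self.target: Optional[str] = None
        self.params: dict = {}

    def handle(self, event: GestureEvent) -> str:
        if event.scale == "large":
            if event.name == "point_select":
                self.target = "object_under_ray"  # placeholder for a raycast hit
                self.params.setdefault(self.target, 0.0)
                return "selected " + self.target
            if event.name == "grab_release":
                released, self.target = self.target, None
                return "released " + str(released)
        elif event.scale == "micro" and self.target is not None:
            # A thumb slide nudges a continuous parameter of the selected object.
            self.params[self.target] += event.value
            return f"{self.target} adjusted to {self.params[self.target]:.2f}"
        return "ignored"

if __name__ == "__main__":
    ctrl = MixedScaleController()
    for ev in (GestureEvent("large", "point_select"),
               GestureEvent("micro", "thumb_slide", 0.10),
               GestureEvent("micro", "thumb_slide", 0.05),
               GestureEvent("large", "grab_release")):
        print(ctrl.handle(ev))

The split mirrors the paper's premise that coarse gestures and microgestures play complementary roles, but the event vocabulary and sensor mapping above are assumptions rather than details taken from the paper.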

Supplemental Material

lbw1233-file3.mp4 (MP4, 10.4 MB)

Published in

      CHI EA '18: Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems
      April 2018
      3155 pages
ISBN: 9781450356213
DOI: 10.1145/3170427

      Copyright © 2018 Owner/Author

      Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      • Published: 20 April 2018

      Qualifiers

      • abstract

      Acceptance Rates

CHI EA '18 Paper Acceptance Rate: 1,208 of 3,955 submissions, 31%
Overall Acceptance Rate: 6,164 of 23,696 submissions, 26%
