Interferi: Gesture Sensing using On-Body Acoustic Interferometry

ABSTRACT
Interferi is an on-body gesture sensing technique using acoustic interferometry. We use ultrasonic transducers resting on the skin to create acoustic interference patterns inside the wearer's body, which interact with anatomical features in complex, yet characteristic ways. We focus on two areas of the body with great expressive power: the hands and face. For each, we built and tested a series of worn sensor configurations, which we used to identify useful transducer arrangements and machine learning features. We created final prototypes for the hand and face, which our study results show can support eleven- and nine-class gesture sets at 93.4% and 89.0% accuracy, respectively. We also evaluated our system in four continuous tracking tasks, including smile intensity and weight estimation, where error never exceeded 9.5%. We believe these results show great promise and illuminate an interesting sensing technique for HCI applications.
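The physical principle behind the technique — coherent ultrasonic sources producing position-dependent constructive and destructive interference — can be illustrated with a minimal two-source sketch. This is not the paper's signal model: the 40 kHz carrier is a common ultrasonic transducer frequency, and the speed of sound in soft tissue (~1540 m/s) and the equal-amplitude, lossless two-source geometry are simplifying assumptions for illustration only.

```python
import math

SPEED_OF_SOUND = 1540.0   # m/s in soft tissue (approximate; assumption)
FREQ = 40_000.0           # Hz; common ultrasonic transducer frequency
WAVELENGTH = SPEED_OF_SOUND / FREQ

def interference_amplitude(d1: float, d2: float) -> float:
    """Relative amplitude at a point whose distances to two coherent,
    equal-amplitude transducers are d1 and d2 (meters).

    Superposing two unit phasors whose phase offset is set by the
    path-length difference gives |1 + exp(j*k*(d2 - d1))|.
    """
    k = 2.0 * math.pi / WAVELENGTH  # wavenumber
    phase = k * (d2 - d1)
    return abs(complex(1.0, 0.0) + complex(math.cos(phase), math.sin(phase)))

# Constructive interference: path difference of one full wavelength (~2.0)
print(interference_amplitude(0.05, 0.05 + WAVELENGTH))
# Destructive interference: path difference of half a wavelength (~0.0)
print(interference_amplitude(0.05, 0.05 + WAVELENGTH / 2))
```

Anatomical features that change a propagation path by even a fraction of a wavelength (under 1 cm here) shift points between these two extremes, which is why the received patterns are so characteristic of pose.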
Figure 21. Interferi could be integrated into future smartwatch bands and AR/VR headset liners, as seen in these mock-ups.