ABSTRACT
We present Bitey, a subtle, wearable device for enabling input via tooth clicks. Based on a bone-conduction microphone worn just above the ears, Bitey recognizes the click sounds from up to five different pairs of teeth, allowing fully hands-free interface control. We explore the space of tooth input and show that Bitey allows for a high degree of accuracy in distinguishing between different tooth clicks, with up to 94% accuracy under laboratory conditions for five different tooth pairs. Finally, we illustrate Bitey's potential through two demonstration applications: a list navigation and selection interface and a keyboard input method.
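The abstract describes classifying bone-conducted click sounds into one of five tooth pairs. The paper's actual recognition pipeline is not given here, so the following is only a minimal sketch of the general idea under assumed simplifications: each isolated click burst is reduced to a few hand-picked features (overall energy, zero-crossing rate, decay shape), and an incoming click is matched against per-user enrolled examples with a nearest-centroid rule. The label names and feature choices are hypothetical, not Bitey's.

```python
import math

def click_features(click):
    """Crude descriptors for one isolated tooth-click burst (a list of samples)."""
    energy = sum(s * s for s in click) + 1e-12
    # Zero-crossing rate: a rough proxy for the click's dominant frequency.
    zcr = sum(1 for a, b in zip(click, click[1:]) if a * b < 0) / max(len(click) - 1, 1)
    # Decay shape: energy in the first half of the burst vs. the second half.
    mid = len(click) // 2
    head = sum(s * s for s in click[:mid]) + 1e-12
    tail = sum(s * s for s in click[mid:]) + 1e-12
    return (math.log10(energy), zcr, math.log10(head / tail))

def classify_click(templates, click):
    """Nearest-centroid match of a click against enrolled examples.

    templates maps a tooth-pair label (e.g. the hypothetical "left_molars")
    to a list of example clicks recorded during an enrollment phase.
    """
    f = click_features(click)
    best_label, best_dist = None, float("inf")
    for label, examples in templates.items():
        feats = [click_features(e) for e in examples]
        centroid = tuple(sum(v[i] for v in feats) / len(feats) for i in range(3))
        dist = sum((a - b) ** 2 for a, b in zip(f, centroid))
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label
```

A deployed system would additionally need to segment click bursts out of the continuous microphone stream and would likely use richer spectral features and a trained classifier to reach the accuracy reported above; this sketch only illustrates the enrollment-then-match structure.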
Index Terms: Bitey: an exploration of tooth click gestures for hands-free user interface control