DOI: 10.1145/1753846.1753918 — CHI '10 Extended Abstracts

Recognizing shapes and gestures using sound as feedback

Published: 10 April 2010

ABSTRACT

The main goal of this research is to show that sound feedback can be used to recognize shapes and gestures. The system is based on the idea of relating spatial representations to sound. The shapes are predefined, and the user has no access to any visual information. The user interacts with the system through a general-purpose pointing device, such as a mouse or a pen tablet, or through the touch screen of a mobile device. While the user explores the space with the pointing device, the system generates a sound whose pitch and intensity vary according to a sonification strategy. Because sounds are tied to the spatial representation, the user forms an auditory perception of shapes and gestures, which can then be traced with the pointing device using sound as the only reference.
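The abstract does not specify the mapping between pointer position and sound, so the following is only a minimal sketch of one plausible strategy: the distance from the pointer to a predefined shape (here, a circle, chosen purely for illustration) is mapped to pitch and intensity, so that the tone is steady and loud on the contour and rises in pitch while fading as the pointer drifts away. The function names, the circle parameters, and the mapping direction are all assumptions, not the paper's method.

```python
import math

def distance_to_circle(x, y, cx=0.5, cy=0.5, r=0.3):
    """Distance from pointer (x, y) to the boundary of a predefined circle.
    The circle is an illustrative stand-in for the paper's predefined shapes."""
    return abs(math.hypot(x - cx, y - cy) - r)

def sonify(x, y, base_hz=220.0, span_hz=660.0, max_dist=0.5):
    """Map pointer position to (pitch_hz, intensity in [0, 1]).
    On the contour the tone is low-pitched and loud; away from it the pitch
    rises and the intensity drops. This mapping is an assumed example, not
    the strategy described in the paper."""
    d = min(distance_to_circle(x, y), max_dist) / max_dist  # normalized distance
    pitch = base_hz + span_hz * d
    intensity = 1.0 - d
    return pitch, intensity
```

Driving an actual oscillator with these values (e.g. via an audio API) would reproduce the described interaction: the user sweeps the pointer until the tone settles at its lowest, loudest state, then follows the contour by keeping it there.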


Supplemental Material

p3063.mov (44.2 MB)

