ABSTRACT
The main goal of this work is to show that sound feedback can be used to recognize shapes and gestures. The system is based on the idea of relating spatial representations to sound. The shapes are predefined, and the user has no access to any visual information. The user interacts with the system through a general-purpose pointing device, such as a mouse, a pen tablet, or the touch screen of a mobile device. While the user explores the space with the pointing device, the system generates sound whose pitch and intensity vary according to a mapping strategy. Because the sounds are tied to the spatial representation, the user forms an auditory perception of shapes and gestures, which can then be followed with the pointing device using sound as the only reference.
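The abstract does not specify the mapping strategy, but the idea can be illustrated with a minimal sketch: map the pointer's distance to a predefined shape (here, a circle) onto pitch and intensity, so that the tone stays low and loud on the outline and rises in pitch while fading as the pointer drifts away. All function names and parameter values below are illustrative assumptions, not the authors' implementation.

```python
import math

def distance_to_circle(x, y, cx=0.5, cy=0.5, r=0.3):
    """Shortest distance from the pointer (x, y) to the circle's outline.

    The circle (center, radius) stands in for any predefined shape.
    """
    return abs(math.hypot(x - cx, y - cy) - r)

def sonify(x, y, base_freq=440.0, max_freq=880.0, falloff=0.25):
    """Return (frequency_hz, amplitude) for a pointer position.

    On the outline: base pitch at full volume. Moving away raises the
    pitch toward max_freq and fades the amplitude to zero, so straying
    off the shape is immediately audible.
    """
    d = distance_to_circle(x, y)
    closeness = max(0.0, 1.0 - d / falloff)  # 1 on the shape, 0 far away
    freq = base_freq + (max_freq - base_freq) * (1.0 - closeness)
    return freq, closeness

# Tracing the outline yields a steady tone; leaving it changes the sound.
on_shape = sonify(0.8, 0.5)    # exactly on the circle's outline
off_shape = sonify(0.95, 0.5)  # 0.15 away from the outline
```

The returned pair would feed a synthesizer (e.g., an oscillator whose frequency and gain are updated on every pointer-move event); the distance function is the only part that changes per shape.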