Everyday listening and auditory icons
Publisher: University of California, San Diego
Order Number: AAI8908009
Pages: 278
Abstract

This thesis represents an attempt to understand how people listen to the world, and how such an understanding can help in developing auditory interfaces for computers.

Part I of this thesis is concerned with everyday listening, the act of gaining information about events in the world by listening to the sounds they make. Chapter 1 introduces this area of research and discusses its relation to other work on audition. In Chapter 2, I suggest that the differences between cognitive and ecological perspectives on perception may be understood as stemming from contrasting views on systems, and that although these perspectives are fundamentally incompatible, they may both prove valuable in understanding perception. In Chapter 3, I examine what we hear, and introduce a framework for understanding the attributes of everyday listening. Finally, in Chapter 4 I investigate the perception of a simple sonic event--struck bars of wood and metal--with the aim of discovering what information for material and length is inherent in the sounds.

Part II shows how the ideas developed in the first part may be applied to the use of sound in computer interfaces. The basic idea is that auditory icons, caricatures of everyday sounds, may be used to present information to users in a way that is analogous to the use of visual icons. This half of my thesis is made up of three papers that discuss my work on auditory icons. First is the paper that introduced this idea (Gaver, 1986), which laid the framework for further research in this area. Second is a technical report written recently for Apple Computer, Inc. (Gaver, 1988), which surveys current techniques for using sound and explores the issues involved in creating auditory icons in more detail. Finally, I include a paper describing the SonicFinder™, a prototype auditory interface that I developed at Apple Computer, Inc. (Gaver, in press). This interface illustrates the use of auditory icons in an actual system.

Cited By

  1. Latupeirissa A, Panariello C and Bresin R (2023). Probing Aesthetics Strategies for Robot Sound: Complexity and Materiality in Movement Sonification, ACM Transactions on Human-Robot Interaction, 12:4, (1-22).
  2. Wirfs-Brock J, Fam A, Devendorf L and Keegan B (2021). Examining Narrative Sonification: Using First-Person Retrospection Methods to Translate Radio Production to Interaction Design, ACM Transactions on Computer-Human Interaction, 28:6, (1-34).
  3. Csapó Á and Wersényi G (2013). Overview of auditory representations in human-machine interfaces, ACM Computing Surveys, 46:2, (1-23).
  4. Wersényi G. Auditory representations of a graphical user interface for a better human-computer interaction, Proceedings of the 6th International Conference on Auditory Display, (80-102).
  5. Oren M. Speed sonic across the span, CHI '07 Extended Abstracts on Human Factors in Computing Systems, (2231-2236).
  6. Rath M and Rohs M. Explorations in sound for tilting-based interfaces, Proceedings of the 8th International Conference on Multimodal Interfaces, (295-301).
  7. Fernstrom M, Brazil E and Bannon L (2005). HCI Design and Interactive Sonification for Fingers and Ears, IEEE MultiMedia, 12:2, (36-44).
  8. Rath M. Examples and ideas in the development of sounding objects, Proceedings of the 4th Conference on Designing Interactive Systems: Processes, Practices, Methods, and Techniques, (370-372).
  9. van den Doel K, Kry P and Pai D. FoleyAutomatic, Proceedings of the 28th Annual Conference on Computer Graphics and Interactive Techniques, (537-544).
  10. Mynatt E (1997). Transforming graphical interfaces into auditory interfaces for blind users, Human-Computer Interaction, 12:1, (7-45).
  11. Gaver W. Synthesizing auditory icons, Proceedings of the INTERACT '93 and CHI '93 Conference on Human Factors in Computing Systems, (228-235).
  12. Moran T and Anderson R. The workaday world as a paradigm for CSCW design, Proceedings of the 1990 ACM Conference on Computer-Supported Cooperative Work, (381-393).
  13. Gaver W (1989). The SonicFinder, Human-Computer Interaction, 4:1, (67-94).
Contributors
  • University of Northumbria
  • University of California, San Diego