This thesis represents an attempt to understand how people listen to the world, and how such an understanding can help in developing auditory interfaces for computers.
Part I of this thesis is concerned with everyday listening, the act of gaining information about events in the world by listening to the sounds they make. Chapter 1 introduces this area of research and discusses its relation to other work on audition. In Chapter 2, I suggest that the differences between cognitive and ecological perspectives on perception may be understood as stemming from contrasting views of systems, and that although these perspectives are fundamentally incompatible, both may prove valuable in understanding perception. In Chapter 3, I examine what we hear and introduce a framework for understanding the attributes of everyday listening. Finally, in Chapter 4 I investigate the perception of a simple class of sonic events, struck bars of wood and metal, with the aim of discovering what information for material and length is inherent in the sounds.
Part II shows how the ideas developed in the first part may be applied to the use of sound in computer interfaces. The basic idea is that auditory icons, caricatures of everyday sounds, may be used to present information to users in a way that is analogous to the use of visual icons. This half of the thesis is made up of three papers that discuss my work on auditory icons. First is the paper that introduced the idea (Gaver, 1986) and laid the foundation for further research in this area. Second is a technical report written recently for Apple Computer, Inc. (Gaver, 1988), which surveys current techniques for using sound and explores the issues involved in creating auditory icons in more detail. Finally, I include a paper describing the SonicFinder™, a prototype auditory interface that I developed at Apple Computer, Inc. (Gaver, in press). This interface illustrates the use of auditory icons in an actual system.