The Role of Sound Source Perception in Gestural Sound Description

Published: 01 April 2014

Abstract

We investigated the gestural descriptions of sound stimuli that participants performed during a listening task. Our hypothesis is that gestural response strategies depend on the level of identification of the sound source, and specifically on the identification of the action causing the sound. To validate this hypothesis, we conducted two experiments. In the first experiment, we built two corpora of sounds: the first contains sounds whose causal action is identifiable, and the second contains sounds for which no causal action could be identified. These corpus properties were validated through a listening test. In the second experiment, participants performed arm and hand gestures synchronously while listening to sounds taken from these corpora. Afterward, we conducted interviews in which participants verbalized their experience while watching their own video recordings; they were questioned on their perception of the sounds they had heard and on their gestural strategies. We showed that for sounds whose causal action can be identified, participants mainly mimic the action that produced the sound. Conversely, when no action can be associated with the sound, participants trace contours related to the sound’s acoustic features. We also found that inter-participant gesture variability is higher for causal sounds than for noncausal sounds. This variability shows that, in the first case, participants have several ways of producing the same action, whereas in the second case, the sound features tend to make the gestural responses consistent.



Published in

ACM Transactions on Applied Perception, Volume 11, Issue 1
April 2014, 69 pages
ISSN: 1544-3558
EISSN: 1544-3965
DOI: 10.1145/2611125

Copyright © 2014 ACM

Publisher

Association for Computing Machinery, New York, NY, United States

Publication History

• Published: 1 April 2014
• Accepted: 1 October 2013
• Revised: 1 July 2013
• Received: 1 November 2012
