Research Article
Affective Calibration of Musical Feature Sets in an Emotionally Intelligent Music Composition System

Published: 10 May 2017

Abstract

Affectively driven algorithmic composition (AAC) is a rapidly growing field that exploits computer-aided composition to generate new music with particular emotional qualities or affective intentions. An AAC system built around a 16-channel feed-forward artificial neural network was devised to generate a stimulus set covering nine discrete sectors of a two-dimensional emotion space. The system generated short pieces of music, rendered with a sampled piano timbre, which were evaluated by a group of experienced listeners who ascribed a two-dimensional valence-arousal coordinate to each stimulus. The underlying musical feature set, initially drawn from the literature, was then adjusted by amplifying or attenuating the quantity of each feature to maximize the spread of stimuli in the valence-arousal space, after which a second listener evaluation was conducted. A third iteration of this process compared the spread of ascribed valence-arousal coordinates against that of an existing prerated stimulus database, demonstrating that this prototype AAC system can create short sequences of music with a slight improvement on the range of emotion found in a stimulus set comprised of real-world, traditionally composed musical excerpts.
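The calibration loop the abstract describes — rate each stimulus on a valence-arousal plane, measure how widely the ratings are spread, then amplify or attenuate musical features before regenerating — can be sketched in miniature. The code below is an illustrative sketch only, not the authors' implementation: the feature names, the example ratings, the choice of mean pairwise distance as the spread measure, and the simple gain rule are all assumptions introduced for demonstration.

```python
import math

# Hypothetical listener ratings: stimulus id -> (valence, arousal), each in [-1, 1].
ratings = {
    "s1": (-0.8, 0.7), "s2": (0.1, 0.1), "s3": (0.9, -0.6),
    "s4": (-0.5, -0.5), "s5": (0.6, 0.8),
}

def spread(coords):
    """One possible spread measure: mean pairwise Euclidean distance
    between the valence-arousal coordinates ascribed to the stimuli."""
    pts = list(coords)
    dists = [math.dist(a, b) for i, a in enumerate(pts) for b in pts[i + 1:]]
    return sum(dists) / len(dists)

def recalibrate(weights, gain, underused):
    """Illustrative adjustment rule: amplify features judged underused
    (too little effect on the ratings) and attenuate the rest."""
    return {f: w * gain if f in underused else w / gain
            for f, w in weights.items()}

# Hypothetical musical feature magnitudes fed to the generator.
weights = {"tempo": 1.0, "mode": 1.0, "rhythmic_density": 1.0}
weights = recalibrate(weights, gain=1.2, underused={"tempo"})

print(f"spread = {spread(ratings.values()):.3f}")
print(f"adjusted tempo weight = {weights['tempo']:.3f}")
```

In the study itself, each such adjustment was followed by regenerating the stimulus set and running a fresh listener evaluation, so the loop closes through human ratings rather than a computed objective.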



Published in

ACM Transactions on Applied Perception, Volume 14, Issue 3, July 2017, 148 pages
ISSN: 1544-3558
EISSN: 1544-3965
DOI: 10.1145/3066910

Copyright © 2017 ACM

Publisher: Association for Computing Machinery, New York, NY, United States

Publication History

• Received: 1 January 2016
• Revised: 1 January 2017
• Accepted: 1 January 2017
• Published: 10 May 2017
