A survey of robotic musicianship

Published: 26 April 2016

Abstract

Reviewing the technologies that enable robot musicians to jam.

        Published in

        Communications of the ACM, Volume 59, Issue 5
        May 2016
        121 pages
        ISSN: 0001-0782
        EISSN: 1557-7317
        DOI: 10.1145/2930840
        Editor: Moshe Y. Vardi

        Copyright © 2016 ACM

        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

        Publisher

        Association for Computing Machinery

        New York, NY, United States

        Publication History

        • Published: 26 April 2016


        Qualifiers

        • review-article
        • Popular
        • Refereed
