Supplemental Material
Available for Download
References
- Albin, A., Weinberg, G. and Egerstedt, M. Musical abstractions in distributed multi-robot systems. In Proceedings of the 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, 451--458.
- Atkeson, C.G. et al. Using humanoid robots to study human behavior. IEEE Intelligent Systems and their Applications 15, 4 (2000), 46--56.
- Barton, S. The human, the mechanical, and the spaces in between: Explorations in human-robotic musical improvisation. In Proceedings of the 9th Artificial Intelligence and Interactive Digital Entertainment Conference (2013).
- Bello, J.P., Daudet, L., Abdallah, S., Duxbury, C., Davies, M. and Sandler, M.B. A tutorial on onset detection in music signals. IEEE Transactions on Speech and Audio Processing 13, 5 (2005), 1035--1047.
- Bretan, M., Cicconet, M., Nikolaidis, R. and Weinberg, G. Developing and composing for a robotic musician. In Proceedings of the International Computer Music Conference (Ljubljana, Slovenia, Sept. 2012).
- Chadefaux, D., Le Carrou, J-L., Vitrani, M-A., Billout, S. and Quartier, L. Harp plucking robotic finger. In Proceedings of the 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, 4886--4891.
- Chen, R., Shen, W., Srinivasamurthy, A. and Chordia, P. Chord recognition using duration-explicit hidden Markov models. In Proceedings of ISMIR (2012), 445--450.
- Cicconet, M., Bretan, M. and Weinberg, G. Human-robot percussion ensemble: Anticipation on the basis of visual cues. IEEE Robotics & Automation Magazine 20, 4 (2013), 105--110.
- Collins, N.M. Towards autonomous agents for live computer music: Real-time machine listening and interactive music systems. Ph.D. thesis, 2006.
- Dannenberg, R.B., Brown, H.B. and Lupish, R. McBlare: A robotic bagpipe player. In Musical Robots and Interactive Multimodal Systems. Springer, 2011, 165--178.
- Ferrand, D. and Vergez, C. Blowing machine for wind musical instrument: Toward a real-time control of the blowing pressure. In Proceedings of the 16th Mediterranean Conference on Control and Automation. IEEE, 2008, 1562--1567.
- Hochenbaum, J. and Kapur, A. Drum stroke computing: Multimodal signal processing for drum stroke identification and performance metrics. In Proceedings of the 2012 International Conference on New Interfaces for Musical Expression.
- Hoffman, G. and Weinberg, G. Synchronization in human-robot musicianship. In Proceedings of RO-MAN 2010. IEEE, 718--724.
- Jordà, S. Afasia: The ultimate Homeric one-man-multimedia-band. In Proceedings of the 2002 Conference on New Interfaces for Musical Expression. National University of Singapore, 1--6.
- Kapur, A. A history of robotic musical instruments. In Proceedings of the 2005 International Computer Music Conference, 21--28.
- Kapur, A. Multimodal techniques for human/robot interaction. In Musical Robots and Interactive Multimodal Systems. Springer, 2011, 215--232.
- Kapur, A., Murphy, J. and Carnegie, D. Kritaanjli: A robotic harmonium for performance, pedagogy and research.
- Kapur, A., Trimpin, Singer, E., Suleman, A. and Tzanetakis, G. A comparison of solenoid-based strategies for robotic drumming. In Proceedings of the International Computer Music Conference (Copenhagen, Denmark, 2007).
- Kato, I., Ohteru, S., Shirai, K., Matsushima, T., Narita, S., Sugano, S., Kobayashi, T. and Fujisawa, E. The robot musician 'WABOT-2' (Waseda robot-2). Robotics 3, 2 (1987), 143--155.
- Kidd, C.D. and Breazeal, C. Effect of a robot on user perceptions. In Proceedings of the 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems, 3559--3564.
- Lerch, A. An Introduction to Audio Content Analysis: Applications in Signal Processing and Music Informatics. John Wiley & Sons, 2012.
- Lim, A., Mizumoto, T., Cahier, L-K, Otsuka, T., Takahashi, T., Komatani, K., Ogata, T. and Okuno, H.G. Robot musical accompaniment: Integrating audio and visual cues for real-time synchronization with a human flutist. In Proceedings of the 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems, 1964--1969.
- Logan-Greene, R. Submersions I. University of Washington, 2011.
- Maes, L., Raes, G-W. and Rogers, T. The man and machine robot orchestra at Logos. Computer Music Journal 35, 4 (2011), 28--48.
- Miranda, E.R. and Tikhanoff, V. Musical composition by autonomous robots: A case study with AIBO. In Proceedings of the 2005 Conference Towards Autonomous Robotic Systems.
- Mizumoto, T., Lim, A., Otsuka, T., Nakadai, K., Takahashi, T., Ogata, T. and Okuno, H.G. Integration of flutist gesture recognition and beat tracking for human-robot ensemble. In Proceedings of IEEE/RSJ-2010 Workshop on Robots and Musical Expression, 159--171.
- Nikolaidis, R. and Weinberg, G. Playing with the masters: A model for improvisatory musical interaction between robots and humans. In Proceedings of RO-MAN 2010. IEEE, 210--217.
- Otsuka, T., Nakadai, K., Takahashi, T., Ogata, T. and Okuno, H.G. Real-time audio-to-score alignment using particle filter for coplayer music robots. EURASIP J. Advances in Signal Processing 2 (2011).
- Paiva, R.P., Mendes, T. and Cardoso, A. On the detection of melody notes in polyphonic audio. In Proceedings of 2005 ISMIR, 175--182.
- Pan, Y., Kim, M-G. and Suzuki, K. A robot musician interacting with a human partner through initiative exchange. In Proceedings of the 2010 Conference on New Interfaces for Musical Expression, 166--169.
- Rowe, R. Machine Musicianship. The MIT Press, 2004.
- Shibuya, K., Ideguchi, H. and Ikushima, K. Volume control by adjusting wrist moment of violin-playing robot. International J. Synthetic Emotions 3, 2 (2012), 31--47.
- Singer, E., Feddersen, J., Redmon, C. and Bowen, B. LEMUR's musical robots. In Proceedings of the 2004 Conference on New Interfaces for Musical Expression. National University of Singapore, 181--184.
- Solis, J., Chida, K., Isoda, S., Suefuji, K., Arino, C. and Takanishi, A. The anthropomorphic flutist robot WF-4R: From mechanical to perceptual improvements. In Proceedings of the 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, 64--69.
- Solis, J., Suefuji, K., Taniguchi, K., Ninomiya, T., Maeda, M. and Takanishi, A. Implementation of expressive performance rules on the WF-4RIII by modeling a professional flutist performance using NN. In Proceedings of the 2007 IEEE International Conference on Robotics and Automation, 2552--2557.
- Solis, J., Takanishi, A. and Hashimoto, K. Development of an anthropomorphic saxophone-playing robot. In Brain, Body and Machine. Springer, 2010, 175--186.
- Weinberg, G. and Driscoll, S. Toward robotic musicianship. Computer Music Journal 30, 4 (2006), 28--45.
- Weinberg, G. and Driscoll, S. The design of a perceptual and improvisational robotic marimba player. In Proceedings of the 16th IEEE International Symposium on Robot and Human Interactive Communication, 2007, 769--774.
- Weinberg, G., Godfrey, M., Rae, A. and Rhoads, J. A real-time genetic algorithm in human-robot musical improvisation. In Computer Music Modeling and Retrieval. Sense of Sounds. Springer, 2008, 351--359.
- Williamson, M.M. Robot arm control exploiting natural dynamics. Ph.D. thesis. Massachusetts Institute of Technology, 1999.
- Zhang, A., Malhotra, M. and Matsuoka, Y. Musical piano performance by the ACT hand. In Proceedings of the 2011 IEEE International Conference on Robotics and Automation, 3536--3541.
Index Terms
- A survey of robotic musicianship