Abstract
We developed a new musical instrument by equipping juggling balls with accelerometers, gyroscopes, and Wi-Fi modules. The system measures acceleration and rotation, allowing it to distinguish a ball in flight from one at rest, and the direction of each throw is derived from the acceleration data.
The movement drives the instrument: the virtual audio source position is animated according to the throw direction, and a different synthesizer is assigned to each ball, producing a synesthetic experience. The system integrates a juggling-pattern simulator through OSC and MIDI APIs, allowing real-time rhythm generation and composition.
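A common way to separate flight from rest with an accelerometer is to threshold the acceleration magnitude: during ballistic flight the sensor reads near zero, while a ball at rest reads roughly 1 g. The sketch below illustrates that idea; the threshold value and function names are illustrative assumptions, not the authors' implementation.

```python
import math

G = 9.81                 # gravitational acceleration, m/s^2
FLIGHT_THRESHOLD = 2.0   # m/s^2; assumed value, tune per sensor noise

def magnitude(ax, ay, az):
    """Euclidean norm of one accelerometer sample."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def is_in_flight(ax, ay, az):
    """A free-falling ball reads ~0 m/s^2; a resting ball reads ~1 g."""
    return magnitude(ax, ay, az) < FLIGHT_THRESHOLD

def throw_direction(launch_samples):
    """Unit vector of the mean acceleration over the launch window.

    launch_samples: list of (ax, ay, az) tuples captured just before release.
    """
    n = len(launch_samples)
    ax = sum(s[0] for s in launch_samples) / n
    ay = sum(s[1] for s in launch_samples) / n
    az = sum(s[2] for s in launch_samples) / n
    m = magnitude(ax, ay, az)
    return (ax / m, ay / m, az / m)
```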
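One plausible way to animate a virtual source position from a throw direction is to convert the direction vector into spherical angles (azimuth and elevation) that a spatial audio renderer can consume. This is a sketch under that assumption; the function name and the axis conventions are illustrative, as the abstract does not specify the mapping.

```python
import math

def direction_to_angles(dx, dy, dz):
    """Map a 3-D throw direction to (azimuth, elevation) in degrees.

    Azimuth is measured in the horizontal plane from the +x axis;
    elevation is measured up from that plane.
    """
    azimuth = math.degrees(math.atan2(dy, dx))
    elevation = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return azimuth, elevation
```

Angles like these could then be sent to a spatializer over OSC, e.g. in a `/ball/<id>/position` message; that address pattern is a made-up example, not an API defined by the system.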
DJuggling: Sonification of expressive movement performance. In ACM SIGGRAPH 2023 Real-Time Live! (SIGGRAPH '23).