Abstract
This paper describes a prototype audio-visual performance featuring a Ufactory Uarm Swift robotic arm and a live musician. In this setting, the robotic arm acted as an AI agent, creating a visual representation of a musical work in real time. An A4 white canvas was gradually filled with a mixture of black, blue, red and yellow paints over approximately eight minutes. The musician, playing an acoustic violin fitted with a custom-built audio interface, performed multiple versions of an improvisatory work developed specifically for the prototype performance. The following sections discuss our technical approach to programming and implementing the Ufactory Uarm Swift as a painting arm, reflect on the musical process, and propose future directions for this project.
| Original language | English |
|---|---|
| Title of host publication | HAI '22 |
| Subtitle of host publication | Proceedings of the 10th Conference on Human-Agent Interaction |
| Place of Publication | New York |
| Publisher | Association for Computing Machinery (ACM) |
| Pages | 253-255 |
| Number of pages | 3 |
| ISBN (Electronic) | 9781450393232 |
| Publication status | Published - 5 Dec 2022 |
| Event | International Conference on Human-Agent Interaction (10th : 2022), Christchurch, New Zealand, 5 Dec 2022 → 8 Dec 2022 |
Conference
| Conference | International Conference on Human-Agent Interaction (10th : 2022) |
|---|---|
| Abbreviated title | HAI ’22 |
| Country/Territory | New Zealand |
| City | Christchurch |
| Period | 5/12/22 → 8/12/22 |
Keywords
- artwork
- audio
- human-robot interaction
- music
- painting
- sound
Title: Robotic arm generative painting through real-time analysis of music performance