MoMusic: A Motion-Driven Human-AI Collaborative Music Composition and Performing System

Authors

  • Weizhen Bian, Hong Kong Baptist University
  • Yijin Song, Hong Kong Baptist University
  • Nianzhen Gu, Hong Kong Baptist University
  • Tin Yan Chan, Hong Kong Baptist University
  • Tsz To Lo, Hong Kong Baptist University
  • Tsun Sun Li, Hong Kong Baptist University
  • King Chak Wong, Hong Kong Baptist University
  • Wei Xue, Hong Kong Baptist University
  • Roberto Alonso Trillo, Hong Kong Baptist University

DOI:

https://doi.org/10.1609/aaai.v37i13.26907

Keywords:

Music Generation, Motion Detection, Voice Conversion, Human-Computer Interaction

Abstract

The significant development of artificial neural network architectures has facilitated the increasing adoption of automated music composition models over the past few years. However, most existing systems feature algorithmic generative structures based on hard-coded, predefined rules, generally excluding interactive or improvised behaviors. We propose a motion-based music system, MoMusic, as an AI real-time music generation system. MoMusic features a partially randomized harmonic sequencing model based on a probabilistic analysis of tonal chord progressions, mathematically abstracted through musical set theory. This model is presented against a two-dimensional grid that produces the resulting sounds through a posture recognition mechanism. A camera captures the movement and trajectories of the users' fingers, creating coherent, partially improvised harmonic progressions. MoMusic integrates several timbrical registers, from traditional classical instruments such as the piano to a new "human voice instrument" created using a voice conversion technique. Our research demonstrates MoMusic's interactiveness, its ability to inspire musicians, and its ability to generate coherent musical material across various timbrical registers. MoMusic's capabilities could easily be expanded to incorporate different forms of posture-controlled timbrical transformation, rhythmic transformation, dynamic transformation, or even digital sound processing techniques.
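The abstract describes a partially randomized harmonic sequencer driven by a probabilistic analysis of tonal chord progressions. As a rough illustration of that idea (not the authors' actual model), the sketch below samples chord progressions from a first-order Markov chain whose transition probabilities are invented placeholders; MoMusic derives its probabilities from musical set theory analysis instead.

```python
import random

# Hypothetical first-order Markov model over Roman-numeral chord functions.
# These transition probabilities are invented for illustration only; MoMusic's
# model is based on a probabilistic analysis of tonal progressions abstracted
# through musical set theory.
TRANSITIONS = {
    "I":   {"IV": 0.35, "V": 0.35, "vi": 0.20, "ii": 0.10},
    "ii":  {"V": 0.70, "vii": 0.30},
    "IV":  {"V": 0.50, "I": 0.30, "ii": 0.20},
    "V":   {"I": 0.70, "vi": 0.30},
    "vi":  {"ii": 0.40, "IV": 0.40, "V": 0.20},
    "vii": {"I": 1.00},
}

def next_chord(current, rng):
    """Sample the next chord from the current chord's transition row."""
    choices, weights = zip(*TRANSITIONS[current].items())
    return rng.choices(choices, weights=weights, k=1)[0]

def progression(start="I", length=8, seed=None):
    """Generate a partially randomized chord progression.

    A fixed seed makes the output reproducible; omitting it yields the
    improvised, run-to-run variation the abstract describes.
    """
    rng = random.Random(seed)
    seq = [start]
    for _ in range(length - 1):
        seq.append(next_chord(seq[-1], rng))
    return seq
```

In the full system, each sampled chord would be mapped to a cell of the two-dimensional grid, so that tracked finger positions select among harmonically plausible continuations rather than arbitrary pitches.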

Published

2024-07-15

How to Cite

Bian, W., Song, Y., Gu, N., Chan, T. Y., Lo, T. T., Li, T. S., Wong, K. C., Xue, W., & Alonso Trillo, R. (2024). MoMusic: A Motion-Driven Human-AI Collaborative Music Composition and Performing System. Proceedings of the AAAI Conference on Artificial Intelligence, 37(13), 16057-16062. https://doi.org/10.1609/aaai.v37i13.26907

Section

EAAI Symposium: Human-Aware AI in Sound and Music