Effects of Deep Neural Networks on the Perceived Creative Autonomy of a Generative Musical System

Authors

  • Jason Smith, Georgia Institute of Technology
  • Jason Freeman, Georgia Institute of Technology

DOI:

https://doi.org/10.1609/aiide.v17i1.18895

Keywords:

Collaborative Artificial Intelligence, Computational Creativity, Deep Learning, Music

Abstract

Collaborative AI agents enable human-computer collaboration in interactive software. In creative spaces such as musical performance, they can exhibit creative autonomy through independent actions and decision-making. These systems, called co-creative systems, autonomously control some aspects of the creative process while a human musician manages others. When users perceive a co-creative system to be more autonomous, they may be willing to cede more creative control to it, leading to an experience they may find more expressive and engaging. This paper describes the design and implementation of a co-creative musical system that captures gestural motion and uses that motion to filter pre-existing audio content. The system hosts two neural network architectures, enabling a comparison of their use as collaborative musical agents. This paper also presents a preliminary study in which subjects recorded short musical performances with the software while alternating between deep and shallow models. The analysis compares users' perceptions of the two models' creative roles and the models' impact on the subjects' sense of self-expression.
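
The abstract describes the system only at a high level. As a rough illustration of the deep-versus-shallow comparison it mentions, the sketch below shows two regression networks sharing one input/output interface, mapping a gesture-motion feature vector to audio filter parameters. The feature sizes, layer counts, output range, and framework (PyTorch) are assumptions for illustration, not the authors' published implementation.

```python
# Illustrative sketch only: the paper does not publish its model code here.
# Assumes gesture motion is summarized as a fixed-length feature vector and
# that the networks regress parameters of an audio filter; sizes are hypothetical.
import torch
import torch.nn as nn

N_GESTURE_FEATURES = 16   # assumed size of a gesture-motion feature vector
N_FILTER_PARAMS = 8       # assumed number of audio filter parameters

# Shallow model: a single hidden layer.
shallow_model = nn.Sequential(
    nn.Linear(N_GESTURE_FEATURES, 32),
    nn.ReLU(),
    nn.Linear(32, N_FILTER_PARAMS),
    nn.Sigmoid(),          # keep filter parameters in [0, 1]
)

# Deep model: several hidden layers behind the same interface, so the two
# can be swapped within the same interactive performance system.
deep_model = nn.Sequential(
    nn.Linear(N_GESTURE_FEATURES, 64),
    nn.ReLU(),
    nn.Linear(64, 64),
    nn.ReLU(),
    nn.Linear(64, 32),
    nn.ReLU(),
    nn.Linear(32, N_FILTER_PARAMS),
    nn.Sigmoid(),
)

# At performance time, either model maps the current gesture frame to
# filter settings applied to the pre-existing audio content.
gesture = torch.rand(1, N_GESTURE_FEATURES)
filter_params = deep_model(gesture)
```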

Published

2021-10-04

How to Cite

Smith, J., & Freeman, J. (2021). Effects of Deep Neural Networks on the Perceived Creative Autonomy of a Generative Musical System. Proceedings of the AAAI Conference on Artificial Intelligence and Interactive Digital Entertainment, 17(1), 91-98. https://doi.org/10.1609/aiide.v17i1.18895