CrowdBand: An Automated Crowdsourcing Sound Composition System

Authors

  • Mary Pietrowicz University of Illinois at Urbana-Champaign
  • Danish Chopra University of Illinois at Urbana-Champaign
  • Amin Sadeghi University of Illinois at Urbana-Champaign
  • Puneet Chandra University of Illinois at Urbana-Champaign
  • Brian Bailey University of Illinois at Urbana-Champaign
  • Karrie Karahalios University of Illinois at Urbana-Champaign

DOI:

https://doi.org/10.1609/hcomp.v1i1.13080

Keywords:

Crowdsourcing, music, sound, composition

Abstract

CrowdBand, a sound composition system, demonstrates how a crowd can create works that meet requested criteria and match an aesthetic character given by keyword descriptions and examples. CrowdBand gives the requestor flexibility over the duration of the music and the completion time and cost of composition through two modes: thrifty and normal. CrowdBand's workflow divides the composition task into three stages: requesting fundamental sounds, assembling sounds into compositions, and evaluating the results. Based on the crowd workers' responses, we conclude that crowd workers who are not musicians can design sounds and create novel sound compositions through CrowdBand. We also conclude that CrowdBand gives musically untrained crowd workers the ability to use common compositional techniques, such as sound layering, vertical stacking of sounds to create harmonic effects, related melodic lines (contrapuntal techniques), and transitions between aesthetic notions, or sound themes. Finally, we show improved, faster results with successive simplification and examples.
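The sketch below illustrates the three-stage workflow described in the abstract (request sounds, assemble compositions, evaluate) and the two requestor modes. It is a minimal, hypothetical illustration only: the function names, data structures, and the way thrifty mode reduces task count are assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a CrowdBand-style three-stage crowdsourcing pipeline.
# All names and behaviors here are illustrative assumptions, not the paper's code.

from dataclasses import dataclass
from typing import List


@dataclass
class Request:
    keywords: List[str]          # aesthetic description, e.g. ["ocean", "calm"]
    duration_sec: int            # requested length of the piece
    mode: str = "normal"         # "normal" or "thrifty" (fewer, cheaper tasks)


@dataclass
class Composition:
    sounds: List[str]            # layered/sequenced sound clips
    votes: int = 0


def request_sounds(req: Request) -> List[str]:
    """Stage 1: post tasks asking workers for fundamental sounds matching the keywords."""
    n_tasks = 5 if req.mode == "thrifty" else 15   # thrifty mode trades breadth for cost/time
    return [f"sound_{i}_{'_'.join(req.keywords)}.wav" for i in range(n_tasks)]


def assemble(sounds: List[str]) -> List[Composition]:
    """Stage 2: workers layer and sequence the collected sounds into candidate compositions."""
    return [Composition(sounds=sounds[i::3]) for i in range(3)]


def evaluate(candidates: List[Composition]) -> Composition:
    """Stage 3: workers rate the candidates; the highest-rated composition is returned."""
    for c in candidates:
        c.votes = len(c.sounds)   # stand-in for real crowd votes
    return max(candidates, key=lambda c: c.votes)


if __name__ == "__main__":
    req = Request(keywords=["ocean", "calm"], duration_sec=60, mode="thrifty")
    best = evaluate(assemble(request_sounds(req)))
    print(best)
```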

Published

2013-11-03

How to Cite

Pietrowicz, M., Chopra, D., Sadeghi, A., Chandra, P., Bailey, B., & Karahalios, K. (2013). CrowdBand: An Automated Crowdsourcing Sound Composition System. Proceedings of the AAAI Conference on Human Computation and Crowdsourcing, 1(1), 121-129. https://doi.org/10.1609/hcomp.v1i1.13080