Emotion-Aware Music Recommendation
DOI:
https://doi.org/10.1609/aaai.v37i13.26911
Keywords:
Deep Learning, Affect, Computer Vision, Transformer, Recommender
Abstract
It is common to listen to songs that match one's mood. Thus, an AI music recommendation system that is aware of the user's emotions is likely to provide a superior user experience to one that is unaware. In this paper, we present an emotion-aware music recommendation system. Multiple models are discussed and evaluated for affect identification from a live image of the user. We propose two models: DRViT, which applies dynamic routing to vision transformers, and InvNet50, which uses involution. All considered models are trained and evaluated on the AffectNet dataset. Each model outputs the user's estimated valence and arousal under the circumplex model of affect. These values are compared to the valence and arousal values for songs in a Spotify dataset, and the top-five closest-matching songs are presented to the user. Experimental results of the models and user testing are presented.
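The abstract describes matching the model's predicted valence and arousal against per-song values and returning the five closest songs. Below is a minimal sketch of that nearest-neighbour lookup in the valence-arousal plane; the file name `songs.csv`, the column names (`title`, `valence`, `arousal`), and the choice of Euclidean distance are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
import pandas as pd

def recommend_songs(user_valence, user_arousal, songs, k=5):
    """Return the k songs whose (valence, arousal) coordinates lie closest
    to the user's estimated affect, measured by Euclidean distance in the
    circumplex (valence-arousal) plane."""
    targets = songs[["valence", "arousal"]].to_numpy()
    diffs = targets - np.array([user_valence, user_arousal])
    dists = np.linalg.norm(diffs, axis=1)
    return songs.assign(distance=dists).nsmallest(k, "distance")

# Hypothetical usage: 'songs.csv' holds one row per track with 'title',
# 'valence', and 'arousal' columns on the same scale as the affect model's
# outputs; the (0.62, -0.15) estimate stands in for a model prediction.
songs = pd.read_csv("songs.csv")
top5 = recommend_songs(user_valence=0.62, user_arousal=-0.15, songs=songs)
print(top5[["title", "distance"]])
```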
Published
2024-07-15
How to Cite
Tran, H., Le, T., Do, A., Vu, T., Bogaerts, S., & Howard, B. (2024). Emotion-Aware Music Recommendation. Proceedings of the AAAI Conference on Artificial Intelligence, 37(13), 16087-16095. https://doi.org/10.1609/aaai.v37i13.26911
Issue
Vol. 37 No. 13
Section
EAAI Symposium: Human-Aware AI in Sound and Music