Tackling Mental Health by Integrating Unobtrusive Multimodal Sensing

Authors

  • Dawei Zhou, University of Rochester
  • Jiebo Luo, University of Rochester
  • Vincent Silenzio, University of Rochester Medical Center
  • Yun Zhou, University of Rochester
  • Jile Hu, University of Rochester
  • Glenn Currier, University of Rochester Medical Center
  • Henry Kautz, University of Rochester

DOI:

https://doi.org/10.1609/aaai.v29i1.9381

Keywords:

mental health, affect signals, multimodal analysis, social media, computer vision, machine learning

Abstract

Mental illness is becoming a major burden in modern societies and poses challenges to the capacity of public health systems worldwide. With the widespread adoption of social media and mobile devices, and rapid advances in artificial intelligence, a unique opportunity arises for tackling mental health problems. In this study, we investigate how users' online social activities and physiological signals detected through ubiquitous sensors can be utilized in realistic scenarios to monitor their mental health states. First, using modern computer vision and signal processing techniques, we extract a suite of multimodal time-series signals from recruited participants while they are immersed in online social media content that elicits emotions and emotion transitions. Next, we use machine learning techniques to build a model that connects mental states to the extracted multimodal signals. Finally, we validate the effectiveness of our approach on two groups of recruited subjects.
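To make the pipeline concrete, the sketch below illustrates the general pattern the abstract describes: multimodal time-series signals per participant are summarized into fixed-length feature vectors, and a supervised classifier links those features to labeled mental states. This is a minimal illustration, not the authors' implementation; the modality names (facial_affect, heart_rate, typing_rhythm), the helper functions, the two-class labels, and the random-forest choice are all assumptions for demonstration.

```python
# Minimal sketch (not the paper's method): fuse summary statistics from
# several hypothetical time-series modalities and cross-validate a classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def summarize_signal(ts: np.ndarray) -> np.ndarray:
    """Collapse one time series into simple summary statistics;
    mean absolute change crudely reflects emotion *transitions*."""
    return np.array([ts.mean(), ts.std(), ts.min(), ts.max(),
                     np.abs(np.diff(ts)).mean()])

def featurize(signals: dict[str, np.ndarray]) -> np.ndarray:
    """Concatenate per-modality statistics into one feature vector
    for a single participant (sorted keys keep the order stable)."""
    return np.concatenate([summarize_signal(ts)
                           for _, ts in sorted(signals.items())])

# Toy data: 40 participants, 3 assumed modalities, 300 time steps each.
rng = np.random.default_rng(0)
modalities = ["facial_affect", "heart_rate", "typing_rhythm"]
X = np.stack([featurize({m: rng.normal(size=300) for m in modalities})
              for _ in range(40)])
y = rng.integers(0, 2, size=40)  # e.g., 0 = control group, 1 = at-risk group

# Cross-validated accuracy of a simple classifier over the fused features.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())
```

Any comparable classifier would fit the same slot; the essential design point is the early fusion step, where heterogeneous signals are reduced to a common feature space before modeling.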

Published

2015-02-16

How to Cite

Zhou, D., Luo, J., Silenzio, V., Zhou, Y., Hu, J., Currier, G., & Kautz, H. (2015). Tackling Mental Health by Integrating Unobtrusive Multimodal Sensing. Proceedings of the AAAI Conference on Artificial Intelligence, 29(1). https://doi.org/10.1609/aaai.v29i1.9381