An Application for Mental Health Monitoring Using Facial, Voice, and Questionnaire Information

Authors

  • Suphalerk Boonvitchaikul, Chulalongkorn University
  • Napat Cheetanom, Chulalongkorn University
  • Tagon Sompong, Chulalongkorn University
  • Jirapat Sununtnasuk, Chulalongkorn University
  • Siri Thammarerkrit, Chulalongkorn University
  • Pattaraporn Pongpanatapipat, Chulalongkorn University
  • Punnaphoj Aeuepalisa, Vulcan Coalition
  • Ananya Kuasakunrungroj, Vulcan Coalition
  • Chatavut Viriyasuthee, Vulcan Coalition
  • Patawee Prakrankamanant, Chulalongkorn University
  • Sorawit Wainipitapong, Chulalongkorn University
  • Ekapol Chuangsuwanich, Chulalongkorn University

DOI:

https://doi.org/10.1609/aaaiss.v1i1.27465

Keywords:

Depression Detection, Multimodal, HAM-D

Abstract

Depression is a major societal issue. However, depression can be hard to self-diagnose, and people suffering from depression often hesitate to consult professionals. We discuss the design and initial testing of our prototype application, which performs depression detection using multimodal information such as questionnaires, speech, and facial landmarks. The application has an animated avatar ask questions concerning the user's well-being. For screening, we opt for a two-stage method that first predicts individual HAM-D item ratings for better explainability, which may help facilitate referral to medical professionals when required. Initial results show that our system achieves a macro-F1 of 0.85 on the depression detection task.
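To make the two-stage idea concrete, the sketch below shows one possible arrangement: a first stage that regresses per-item HAM-D ratings from fused multimodal features, and a second stage that classifies depression from the predicted rating profile. The feature dimensions, models, synthetic data, and cut-off are illustrative assumptions, not the authors' actual implementation.

    # Minimal sketch of a two-stage depression screening pipeline.
    # Stage 1 predicts individual HAM-D item ratings from fused multimodal
    # features (questionnaire + speech + face landmarks); stage 2 maps the
    # predicted rating profile to a binary depression label.
    # All sizes, models, and thresholds here are assumptions for illustration.
    import numpy as np
    from sklearn.linear_model import Ridge, LogisticRegression
    from sklearn.multioutput import MultiOutputRegressor

    rng = np.random.default_rng(0)

    N_SAMPLES = 200        # hypothetical number of interview sessions
    N_FEATURES = 64        # hypothetical fused multimodal feature size
    N_HAMD_ITEMS = 17      # standard 17-item HAM-D

    # Synthetic stand-ins for the fused features and labels.
    X = rng.normal(size=(N_SAMPLES, N_FEATURES))
    hamd_items = rng.integers(0, 5, size=(N_SAMPLES, N_HAMD_ITEMS)).astype(float)
    depressed = (hamd_items.sum(axis=1) >= 17).astype(int)  # assumed cut-off

    # Stage 1: regress each HAM-D item score for better explainability.
    item_model = MultiOutputRegressor(Ridge(alpha=1.0)).fit(X, hamd_items)
    predicted_items = item_model.predict(X)

    # Stage 2: classify depression from the predicted item profile.
    screen_model = LogisticRegression(max_iter=1000).fit(predicted_items, depressed)
    print("Predicted labels (first 5):", screen_model.predict(predicted_items[:5]))

Because the intermediate output is a full HAM-D item profile rather than a single score, a clinician reviewing a positive screen can see which items drove the prediction, which is the explainability benefit the abstract refers to.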

Published

2023-10-03