Understanding Social Interpersonal Interaction via Synchronization Templates of Facial Events

Authors

  • Rui Li, Rochester Institute of Technology
  • Jared Curhan, Massachusetts Institute of Technology
  • Mohammed Hoque, University of Rochester

DOI:

https://doi.org/10.1609/aaai.v32i1.11514

Keywords:

Facial expression, Social interaction, Interactional synchrony, Video conferencing, Coupled hidden Markov model, Beta-Bernoulli process, Bayesian nonparametrics, Gibbs sampling

Abstract

Automatic facial expression analysis in interpersonal communication is challenging, not only because conversation partners' facial expressions mutually influence each other, but also because facial expressions cannot be interpreted correctly without taking the social context into account. In this paper, we propose a probabilistic framework that models interactional synchronization between conversation partners based on their facial expressions. Interactional synchronization manifests the temporal dynamics of conversation partners' mutual influence. In particular, the model discovers a set of common and unique facial synchronization templates directly from natural interpersonal interaction, without recourse to any predefined labeling scheme. The facial synchronization templates represent periodic coordinations of facial events shared by multiple conversation pairs in a specific social context. We test our model on two types of dyadic conversations: negotiations and job interviews. Based on the discovered facial event coordination, we predict conversation outcomes with higher accuracy than HMMs and GMMs.
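The keywords point to a Beta-Bernoulli (Bayesian nonparametric) prior over which synchronization templates each conversation pair activates. The sketch below is a minimal illustration of that kind of finite Beta-Bernoulli feature allocation, not the authors' model: the pair counts, template counts, concentration parameter, and the common/unique split rule are all assumptions made for the example.

```python
# Illustrative sketch of a finite Beta-Bernoulli feature allocation,
# in the spirit of the "Beta-Bernoulli process" keyword.
# NOT the paper's implementation; n_pairs, n_templates, and alpha are
# hypothetical values chosen only for demonstration.
import numpy as np

rng = np.random.default_rng(0)

n_pairs = 10        # hypothetical number of conversation pairs
n_templates = 20    # finite truncation of the template dictionary
alpha = 2.0         # concentration: larger alpha -> more active templates

# Template usage probabilities: pi_k ~ Beta(alpha / K, 1).
pi = rng.beta(alpha / n_templates, 1.0, size=n_templates)

# Binary activation matrix Z: Z[n, k] = 1 if pair n uses template k.
Z = rng.random((n_pairs, n_templates)) < pi

# "Common" templates are shared by many pairs; "unique" templates
# are active for exactly one pair (threshold chosen for illustration).
counts = Z.sum(axis=0)
common = np.where(counts >= n_pairs // 2)[0]
unique = np.where(counts == 1)[0]
print("common templates:", common)
print("unique templates:", unique)
```

In the paper's setting, such binary activations would index latent facial-event templates whose temporal structure is captured by a coupled hidden Markov model and inferred with Gibbs sampling; this sketch only illustrates the sparse template-allocation idea.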

Published

2018-04-25

How to Cite

Li, R., Curhan, J., & Hoque, M. (2018). Understanding Social Interpersonal Interaction via Synchronization Templates of Facial Events. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1). https://doi.org/10.1609/aaai.v32i1.11514

Issue

Vol. 32 No. 1 (2018)

Section

AAAI Technical Track: Human-Computation and Crowd Sourcing