Multi-Document Transformer for Personality Detection

Authors

  • Feifan Yang, Sun Yat-sen University
  • Xiaojun Quan, Sun Yat-sen University
  • Yunyi Yang, Sun Yat-sen University
  • Jianxing Yu, Sun Yat-sen University

DOI:

https://doi.org/10.1609/aaai.v35i16.17673

Keywords:

Text Classification & Sentiment Analysis

Abstract

Personality detection aims to identify the personality traits implied in social media posts. The core of this task is to combine information scattered across a user's multiple posts into an overall personality profile. Existing approaches either encode each post individually or assemble posts arbitrarily into a new document that can be encoded sequentially or hierarchically. While the first approach ignores the connections between posts, the second tends to introduce unnecessary post-order bias. In this paper, we propose a multi-document Transformer, namely Transformer-MD, to tackle the above issues. When encoding each post, Transformer-MD allows access to information in the user's other posts through Transformer-XL's memory tokens, which share the same position embedding. Besides, personality is usually defined along several traits, and each trait may need to attend to different post information, a concern that has rarely been addressed by existing research. To address it, we propose a dimension attention mechanism on top of Transformer-MD to obtain trait-specific representations for multi-trait personality detection. We evaluate the proposed model on the Kaggle and Pandora MBTI datasets, and the experimental results show that it compares favorably with baseline methods.
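The dimension attention idea in the abstract can be illustrated with a minimal sketch: one learned query per personality trait attends over the user's encoded posts, yielding a trait-specific user representation for each trait. All names, shapes, and the choice of dot-product attention here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def dimension_attention(post_reprs: np.ndarray, trait_queries: np.ndarray) -> np.ndarray:
    """Hypothetical trait-wise attention over post representations.

    post_reprs:    (num_posts, hidden_dim)  -- one vector per encoded post
    trait_queries: (num_traits, hidden_dim) -- one learned query per trait
                   (e.g. the four MBTI dimensions: I/E, N/S, T/F, J/P)
    returns:       (num_traits, hidden_dim) -- one user vector per trait
    """
    # dot-product scores between each trait query and each post
    scores = trait_queries @ post_reprs.T            # (num_traits, num_posts)
    # softmax over posts, per trait (numerically stabilized)
    scores -= scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    # each trait aggregates the posts with its own attention weights
    return weights @ post_reprs                      # (num_traits, hidden_dim)

# usage: 10 posts, hidden size 64, 4 MBTI trait dimensions
rng = np.random.default_rng(0)
posts = rng.standard_normal((10, 64))
queries = rng.standard_normal((4, 64))
user_vecs = dimension_attention(posts, queries)
print(user_vecs.shape)  # (4, 64)
```

Because each trait has its own attention distribution, one trait can focus on a different subset of posts than another, which is the motivation the abstract gives for trait-specific representations.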

Published

2021-05-18

How to Cite

Yang, F., Quan, X., Yang, Y., & Yu, J. (2021). Multi-Document Transformer for Personality Detection. Proceedings of the AAAI Conference on Artificial Intelligence, 35(16), 14221-14229. https://doi.org/10.1609/aaai.v35i16.17673

Section

AAAI Technical Track on Speech and Natural Language Processing III