Attention-over-Attention Field-Aware Factorization Machine

Authors

  • Zhibo Wang, Wuhan University
  • Jinxin Ma, Wuhan University
  • Yongquan Zhang, Wuhan University
  • Qian Wang, Wuhan University
  • Ju Ren, Tsinghua University
  • Peng Sun, Zhejiang University

DOI:

https://doi.org/10.1609/aaai.v34i04.6101

Abstract

Factorization Machine (FM) has been a popular approach in supervised predictive tasks, such as click-through rate prediction and recommender systems, due to its strong performance and efficiency. Recently, several variants of FM have been proposed to improve its performance. However, most state-of-the-art prediction algorithms neglect the field information of features, and they also fail to discriminate the importance of feature interactions due to the problem of redundant features. In this paper, we present a novel algorithm called Attention-over-Attention Field-aware Factorization Machine (AoAFFM) to better capture the characteristics of feature interactions. Specifically, we propose a field-aware embedding layer to exploit the field information of features, and combine it with an attention-over-attention mechanism that learns both feature-level and interaction-level attention to estimate the weights of feature interactions. Experimental results show that the proposed AoAFFM improves upon FM and FFM by a large margin, and outperforms state-of-the-art algorithms on three public benchmark datasets.
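To make the idea concrete, the following is a minimal NumPy sketch of the two ingredients the abstract names: field-aware embeddings (each feature holds one latent vector per field, as in FFM) and attention over pairwise interactions (as in attentive FM variants). All names, shapes, and the single attention layer shown here are illustrative assumptions, not the paper's exact AoAFFM architecture, which stacks a second (attention-over-attention) level.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (sizes are illustrative, not from the paper):
# m active features, f fields, embedding dimension k.
m, f, k = 4, 3, 8
fields = np.array([0, 0, 1, 2])       # field id of each feature
emb = rng.normal(size=(m, f, k))      # feature i keeps one vector per field

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Field-aware pairwise interactions: feature i interacts with feature j
# using the embedding indexed by j's field, and vice versa (FFM-style).
pairs, interactions = [], []
for i in range(m):
    for j in range(i + 1, m):
        v = emb[i, fields[j]] * emb[j, fields[i]]  # element-wise product
        pairs.append((i, j))
        interactions.append(v)
interactions = np.stack(interactions)              # (num_pairs, k)

# Interaction-level attention: score each pair and softmax-normalize,
# so redundant interactions can be down-weighted.
w_att = rng.normal(size=k)
alpha = softmax(interactions @ w_att)              # (num_pairs,)

# Attention-weighted pooling of interactions, projected to a scalar logit.
w_out = rng.normal(size=k)
logit = (alpha[:, None] * interactions).sum(axis=0) @ w_out
print(f"{len(pairs)} pairwise interactions, prediction logit = {logit:.4f}")
```

The attention weights `alpha` play the role of interaction-level importance; AoAFFM additionally learns feature-level attention and combines the two levels, which this sketch omits.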

Published

2020-04-03

How to Cite

Wang, Z., Ma, J., Zhang, Y., Wang, Q., Ren, J., & Sun, P. (2020). Attention-over-Attention Field-Aware Factorization Machine. Proceedings of the AAAI Conference on Artificial Intelligence, 34(04), 6323-6330. https://doi.org/10.1609/aaai.v34i04.6101

Section

AAAI Technical Track: Machine Learning