Linking Transformer to Hawkes Process for Information Cascade Prediction (Student Abstract)

Authors

  • Liu Yu, University of Electronic Science and Technology of China
  • Xovee Xu, University of Electronic Science and Technology of China
  • Ting Zhong, University of Electronic Science and Technology of China
  • Goce Trajcevski, Iowa State University
  • Fan Zhou, University of Electronic Science and Technology of China

DOI:

https://doi.org/10.1609/aaai.v36i11.21688

Keywords:

Information Cascade, Hawkes Process, Attention Mechanism

Abstract

An information cascade is typically formalized as a (simplified) discrete sequence of events, and recent approaches have tackled its prediction via variants of recurrent neural networks. However, the information diffusion process is essentially an evolving directed acyclic graph (DAG) in the continuous-time domain. In this paper, we propose a Transformer-enhanced Hawkes process (Hawkesformer), which links a hierarchical attention mechanism with the Hawkes process to model the arrival stream of discrete events in continuous time. A two-level attention architecture parameterizes the intensity function of Hawkesformer: it captures long-term dependencies between nodes in the graph and embeds the cascade evolution rate to better model short-term outbreaks. Experimental results demonstrate significant improvements of Hawkesformer over the state of the art.
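
For context, a minimal sketch of how an attention-derived representation can parameterize a continuous-time intensity is given below; the base rate \mu, the excitation parameters \alpha and \beta, the hidden state \mathbf{h}(t_j), and the softplus link are illustrative assumptions in the spirit of Transformer-based Hawkes models, not the exact formulation used in Hawkesformer.

  % Classical Hawkes intensity: a background rate plus exponentially
  % decaying self-excitation from all past events t_i < t.
  \lambda(t) = \mu + \sum_{t_i < t} \alpha \, e^{-\beta (t - t_i)}

  % Assumed neural parameterization: the attention output h(t_j) of the
  % most recent event drives the intensity between events, with a linear
  % drift in elapsed time; softplus keeps the intensity positive.
  \lambda(t) = \mathrm{softplus}\!\left( \mathbf{w}^{\top} \mathbf{h}(t_j) + b \,(t - t_j) \right), \qquad t \in (t_j, t_{j+1}]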

Published

2022-06-28

How to Cite

Yu, L., Xu, X., Zhong, T., Trajcevski, G., & Zhou, F. (2022). Linking Transformer to Hawkes Process for Information Cascade Prediction (Student Abstract). Proceedings of the AAAI Conference on Artificial Intelligence, 36(11), 13103-13104. https://doi.org/10.1609/aaai.v36i11.21688