C-NTPP: Learning Cluster-Aware Neural Temporal Point Process
DOI:
https://doi.org/10.1609/aaai.v37i6.25897
Keywords:
ML: Time-Series/Data Streams, ML: Applications, ML: Bayesian Learning, ML: Classification and Regression, ML: Clustering, ML: Deep Generative Models & Autoencoders, ML: Deep Neural Architectures, ML: Deep Neural Network Algorithms, ML: Representation Learning, RU: Stochastic Models & Probabilistic Inference
Abstract
Event sequences in continuous time are ubiquitous across applications and have been intensively studied with both classic temporal point processes (TPPs) and their recent deep network variants. This work is motivated by the observation that many event datasets exhibit inherent clustering patterns in the sparse correlation among events, yet such characteristics are seldom explicitly considered in existing neural TPP models, whose history encoders are often embodied by RNNs or Transformers. In this work, we propose c-NTPP (Cluster-Aware Neural Temporal Point Process), which leverages a sequential variational autoencoder framework to infer the latent cluster to which each event in the sequence belongs. Specifically, a novel event-clustered attention mechanism is devised to learn each cluster and then aggregate them to obtain the final representation of each event. Extensive experiments show that c-NTPP achieves superior performance on both real-world and synthetic datasets, and it can also uncover the underlying clustering correlations.
Downloads
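The abstract's event-clustered attention can be illustrated with a minimal sketch: given soft cluster assignments for each event (as a sequential-VAE encoder might infer), attention is computed within each latent cluster and the per-cluster contexts are aggregated into the final event representation. All names and the exact weighting scheme below are assumptions for illustration, not the paper's actual formulation.

```python
import numpy as np

def cluster_aware_attention(h, cluster_probs):
    """Illustrative sketch of event-clustered attention (hypothetical form).

    h:             (T, d) event embeddings from the history encoder
    cluster_probs: (T, K) soft assignment of each event to K latent clusters,
                   e.g. as inferred by a sequential-VAE encoder
    """
    T, d = h.shape
    K = cluster_probs.shape[1]

    scores = h @ h.T / np.sqrt(d)              # (T, T) raw attention scores
    causal = np.tril(np.ones((T, T)))          # attend only to past/current events
    scores = np.where(causal > 0, scores, -np.inf)

    # causal softmax over the history of each event
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)          # (T, T) attention weights

    out = np.zeros_like(h)
    for k in range(K):
        # weight each event pair by how strongly both events belong to cluster k,
        # so attention is effectively restricted to within-cluster correlations
        mask = cluster_probs[:, [k]] * cluster_probs[:, k][None, :]  # (T, T)
        out += (w * mask) @ h                  # aggregate per-cluster contexts
    return out                                 # (T, d) final event representations
```

The within-cluster masking captures the sparse correlation structure the abstract refers to: events only attend strongly to history events assigned to the same latent cluster, rather than uniformly to the whole sequence.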
Published
2023-06-26
How to Cite
Ding, F., Yan, J., & Wang, H. (2023). C-NTPP: Learning Cluster-Aware Neural Temporal Point Process. Proceedings of the AAAI Conference on Artificial Intelligence, 37(6), 7369-7377. https://doi.org/10.1609/aaai.v37i6.25897
Issue
Section
AAAI Technical Track on Machine Learning I