TY  - JOUR
AU  - Soen, Alexander
AU  - Mathews, Alexander
AU  - Grixti-Cheng, Daniel
AU  - Xie, Lexing
PY  - 2021/05/18
Y2  - 2024/03/28
TI  - UNIPoint: Universally Approximating Point Processes Intensities
JF  - Proceedings of the AAAI Conference on Artificial Intelligence
JA  - AAAI
VL  - 35
IS  - 11
SE  - AAAI Technical Track on Machine Learning IV
DO  - 10.1609/aaai.v35i11.17165
UR  - https://ojs.aaai.org/index.php/AAAI/article/view/17165
SP  - 9685-9694
AB  - Point processes are a useful mathematical tool for describing events over time, and so there are many recent approaches for representing and learning them. One notable open question is how to precisely describe the flexibility of point process models and whether there exists a general model that can represent all point processes. Our work bridges this gap. Focusing on the widely used event intensity function representation of point processes, we provide a proof that a class of learnable functions can universally approximate any valid intensity function. The proof connects the well-known Stone-Weierstrass Theorem for function approximation, the uniform density of non-negative continuous functions using a transfer function, the formulation of the parameters of a piece-wise continuous function as a dynamical system, and a recurrent neural network implementation for capturing the dynamics. Using these insights, we design and implement UNIPoint, a novel neural point process model, using recurrent neural networks to parameterise sums of basis functions upon each event. Evaluations on synthetic and real-world datasets show that this simpler representation performs better than Hawkes process variants and more complex neural network-based approaches. We expect this result will provide a practical basis for selecting and tuning models, as well as furthering theoretical work on representational complexity and learnability.
ER  -
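
The abstract describes an RNN that, at each event, emits the parameters of a sum of basis functions whose non-negative transform serves as the conditional intensity. The following is a minimal illustrative sketch of that idea, not the authors' implementation: the class name `UNIPointSketch`, the exponential choice of basis, the softplus transfer function, and all dimensions are assumptions made for the example.

```python
import numpy as np

def softplus(x):
    # non-negative transfer function, keeps the intensity valid (>= 0)
    return np.log1p(np.exp(x))

class UNIPointSketch:
    """Illustrative sketch (not the paper's code): a vanilla RNN updates a
    hidden state at each event; a linear readout yields parameters
    (alpha_j, beta_j) of J exponential basis functions; the intensity at
    time tau since the last event is softplus of their sum."""
    def __init__(self, hidden=8, n_basis=4, seed=0):
        rng = np.random.default_rng(seed)
        self.Wh = rng.normal(scale=0.3, size=(hidden, hidden))
        self.Wx = rng.normal(scale=0.3, size=(hidden,))
        self.Wo = rng.normal(scale=0.3, size=(2 * n_basis, hidden))
        self.h = np.zeros(hidden)
        self.J = n_basis

    def update(self, dt):
        # RNN step on the inter-event time: hidden state carries event history
        self.h = np.tanh(self.Wh @ self.h + self.Wx * dt)

    def intensity(self, tau):
        # sum of exponential basis functions of time-since-last-event tau,
        # passed through softplus so that lambda(t) >= 0
        p = self.Wo @ self.h
        alpha, beta = p[:self.J], softplus(p[self.J:])
        return softplus(np.sum(alpha * np.exp(-beta * tau)))

model = UNIPointSketch()
lam_before = model.intensity(0.5)   # intensity 0.5 time units after an event
model.update(dt=0.5)                # observe an event after 0.5 time units
lam_after = model.intensity(0.2)    # intensity under the updated history
print(lam_before > 0 and lam_after > 0)
```

Because the basis parameters are re-emitted after every event, the intensity between events is a simple closed-form curve, which is what makes this representation cheap to evaluate compared with intensity models that require numerical integration.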