Exact, Fast and Expressive Poisson Point Processes via Squared Neural Families

Authors

  • Russell Tsuchida — Data61, CSIRO
  • Cheng Soon Ong — Data61, CSIRO; Australian National University
  • Dino Sejdinovic — University of Adelaide

DOI:

https://doi.org/10.1609/aaai.v38i18.30041

Keywords:

RU: Probabilistic Inference, ML: Kernel Methods, ML: Other Foundations of Machine Learning

Abstract

We introduce squared neural Poisson point processes (SNEPPPs), which parameterise the intensity function as the squared norm of a two-layer neural network. When the hidden layer is fixed and the second layer has a single neuron, our approach resembles previous uses of squared Gaussian process or kernel methods, but learning the hidden layer provides additional flexibility. In many cases of interest, the integrated intensity function admits a closed form and can be computed in quadratic time in the number of hidden neurons. We enumerate far more such cases than have previously been discussed. Our approach is more memory and time efficient than naive implementations of squared or exponentiated kernel methods or Gaussian processes. Maximum likelihood and maximum a posteriori estimates in a reparameterisation of the final layer of the intensity function can be obtained by solving a (strongly) convex optimisation problem using projected gradient descent. We demonstrate SNEPPPs on real and synthetic benchmarks, and provide a software implementation.
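The core construction can be illustrated with a short sketch. Below, the intensity is the squared norm of a two-layer network, λ(x) = ‖V σ(Wx + b)‖², which is nonnegative by construction, and its integral over the domain reduces by linearity to tr(VᵀV A), where A = ∫ φ(x)φ(x)ᵀ dx is an m × m matrix of integrated feature outer products. The paper derives closed forms for A for many activation/domain pairs; this sketch (all variable names and the Riemann-sum quadrature standing in for those closed forms are illustrative assumptions, not the authors' implementation) simply verifies the quadratic-in-m trace identity numerically on X = [0, 1]:

```python
import numpy as np

rng = np.random.default_rng(0)

m, d = 8, 1                       # hidden width, input dimension
W = rng.normal(size=(m, d))       # first-layer weights (fixed or learnt)
b = rng.normal(size=m)            # first-layer biases
V = rng.normal(size=(3, m))       # second-layer weights (3 output neurons)

def phi(x):
    # Hidden-layer features: x of shape (n, d) -> (n, m).
    return np.tanh(x @ W.T + b)

def intensity(x):
    # lambda(x) = || V phi(x) ||^2, nonnegative by construction.
    h = phi(x) @ V.T
    return np.sum(h ** 2, axis=1)

# Quadrature grid on X = [0, 1] (a stand-in for the paper's closed forms).
n = 4000
xs = ((np.arange(n) + 0.5) / n).reshape(-1, 1)
dx = 1.0 / n

# A = int phi(x) phi(x)^T dx, approximated by a Riemann sum: (m, m).
P = phi(xs)
A = (P.T @ P) * dx

# Integrated intensity via the trace identity -- quadratic in m given A.
integral_trace = np.trace(V.T @ V @ A)

# Direct quadrature of lambda(x) for comparison.
integral_direct = intensity(xs).sum() * dx

print(integral_trace, integral_direct)
```

Once A is available in closed form, evaluating the integrated intensity costs O(m²), which is what makes the Poisson likelihood (sum of log-intensities minus the integrated intensity) exact and fast to compute.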

Published

2024-03-24

How to Cite

Tsuchida, R., Ong, C. S., & Sejdinovic, D. (2024). Exact, Fast and Expressive Poisson Point Processes via Squared Neural Families. Proceedings of the AAAI Conference on Artificial Intelligence, 38(18), 20559-20566. https://doi.org/10.1609/aaai.v38i18.30041

Section

AAAI Technical Track on Reasoning under Uncertainty