Towards Training Probabilistic Topic Models on Neuromorphic Multi-Chip Systems


  • Zihao Xiao Tsinghua University
  • Jianfei Chen Tsinghua University
  • Jun Zhu Tsinghua University


probabilistic topic model, neuromorphic chip, spiking neural network


Probabilistic topic models, including probabilistic latent semantic indexing (pLSI) and latent Dirichlet allocation (LDA), are popular unsupervised learning methods. To date, their training has been implemented on general-purpose computers (GPCs), which are flexible to program but energy-consuming. Towards low-energy implementations, this paper investigates their training on an emerging hardware technology called neuromorphic multi-chip systems (NMSs). NMSs are very effective for a family of algorithms called spiking neural networks (SNNs). We present three SNNs to train topic models. The first SNN is a batch algorithm that combines the conventional collapsed Gibbs sampling (CGS) algorithm with an inference SNN to train LDA. The other two SNNs are online algorithms targeting environments that are both energy- and storage-limited. The two online algorithms are equivalent to training LDA by maximum-a-posteriori estimation and by maximizing the semi-collapsed likelihood, respectively. They use novel, tailored ordinary differential equations for stochastic optimization. We simulate the new algorithms and show that they are comparable with the GPC algorithms while being suitable for NMS implementation. We also propose an extension to train pLSI and a method to prune the network to obey the limited fan-in of some NMSs.
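As background for the first algorithm, the sketch below shows a plain collapsed Gibbs sampler for LDA on a GPC. It is a minimal, hypothetical reference implementation of standard CGS only; the paper's SNN-based inference component and the NMS mapping are not reproduced here, and all function and variable names are illustrative.

```python
import numpy as np

def collapsed_gibbs_lda(docs, n_topics, n_vocab, alpha=0.1, beta=0.01,
                        n_iters=50, seed=0):
    """Minimal collapsed Gibbs sampler for LDA (illustrative sketch only).

    docs: list of documents, each a list of word ids in [0, n_vocab).
    Returns doc-topic and topic-word count matrices.
    """
    rng = np.random.default_rng(seed)
    # Sufficient statistics: doc-topic counts, topic-word counts, topic totals.
    n_dk = np.zeros((len(docs), n_topics))
    n_kw = np.zeros((n_topics, n_vocab))
    n_k = np.zeros(n_topics)
    # Randomly initialize topic assignments and fill in the counts.
    assignments = []
    for d, doc in enumerate(docs):
        z = rng.integers(n_topics, size=len(doc))
        assignments.append(z)
        for w, k in zip(doc, z):
            n_dk[d, k] += 1; n_kw[k, w] += 1; n_k[k] += 1
    for _ in range(n_iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                # Remove the current assignment from the counts.
                k = assignments[d][i]
                n_dk[d, k] -= 1; n_kw[k, w] -= 1; n_k[k] -= 1
                # Collapsed conditional:
                # p(z=k | rest) ∝ (n_dk + alpha) * (n_kw + beta) / (n_k + V*beta)
                p = (n_dk[d] + alpha) * (n_kw[:, w] + beta) \
                    / (n_k + n_vocab * beta)
                k = rng.choice(n_topics, p=p / p.sum())
                # Record the new assignment and restore the counts.
                assignments[d][i] = k
                n_dk[d, k] += 1; n_kw[k, w] += 1; n_k[k] += 1
    return n_dk, n_kw
```

The per-token resampling loop above is the step the paper replaces with spike-based computation; on a GPC it stores all count matrices explicitly, which is exactly the memory cost the online algorithms aim to avoid.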




How to Cite

Xiao, Z., Chen, J., & Zhu, J. (2018). Towards Training Probabilistic Topic Models on Neuromorphic Multi-Chip Systems. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1). Retrieved from



AAAI Technical Track: Reasoning under Uncertainty