OT-Flow: Fast and Accurate Continuous Normalizing Flows via Optimal Transport


  • Derek Onken Emory University
  • Samy Wu Fung UCLA
  • Xingjian Li Emory University
  • Lars Ruthotto Emory University




Neural Generative Models & Autoencoders


A normalizing flow is an invertible mapping between an arbitrary probability distribution and a standard normal distribution; it can be used for density estimation and statistical inference. Computing the flow follows the change of variables formula and thus requires invertibility of the mapping and an efficient way to compute the determinant of its Jacobian. To satisfy these requirements, normalizing flows typically consist of carefully chosen components. Continuous normalizing flows (CNFs) are mappings obtained by solving a neural ordinary differential equation (ODE). The neural ODE's dynamics can be chosen almost arbitrarily while ensuring invertibility. Moreover, the log-determinant of the flow's Jacobian can be obtained by integrating the trace of the dynamics' Jacobian along the flow. Our proposed OT-Flow approach tackles two critical computational challenges that limit more widespread use of CNFs. First, OT-Flow leverages optimal transport (OT) theory to regularize the CNF and enforce straight trajectories that are easier to integrate. Second, OT-Flow features exact trace computation with time complexity equal to that of the trace estimators used in existing CNFs. On five high-dimensional density estimation and generative modeling tasks, OT-Flow performs competitively with state-of-the-art CNFs while on average requiring one-fourth the number of weights, with an 8x speedup in training time and a 24x speedup in inference.
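The CNF mechanism described above can be illustrated with a minimal sketch: integrate the coupled ODE for the state z(t) and a log-determinant accumulator whose rate is the trace of the dynamics' Jacobian. The single-layer tanh dynamics, forward Euler integrator, and all variable names below are illustrative assumptions, not the architecture or solver used in OT-Flow; the trace here is computed exactly because the toy dynamics admit a closed form.

```python
import numpy as np

def dynamics(z, W, b):
    # Toy ODE dynamics f(z) = tanh(W z + b); a stand-in for a
    # neural network, not the OT-Flow potential model.
    return np.tanh(W @ z + b)

def trace_jacobian(z, W, b):
    # Exact trace of df/dz for the toy dynamics:
    # d f_i / d z_i = (1 - tanh^2((W z + b)_i)) * W[i, i]
    s = 1.0 - np.tanh(W @ z + b) ** 2
    return float(np.sum(s * np.diag(W)))

def integrate(z0, W, b, T=1.0, n_steps=100):
    # Forward Euler on the coupled system: the state z(t) and the
    # accumulator l(t) with dl/dt = tr(df/dz), so that l(T)
    # approximates the log-determinant of the flow's Jacobian.
    z = z0.copy()
    logdet = 0.0
    h = T / n_steps
    for _ in range(n_steps):
        logdet += h * trace_jacobian(z, W, b)
        z = z + h * dynamics(z, W, b)
    return z, logdet
```

By the change of variables formula, this accumulated log-determinant is exactly the correction term needed to evaluate the model density: log p_x(z0) = log p_z(z(T)) + l(T) when the flow maps the data distribution to a standard normal. A quick sanity check is to compare l(T) against the log-determinant of a finite-difference Jacobian of the map z0 -> z(T); the two agree up to discretization error.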




How to Cite

Onken, D., Wu Fung, S., Li, X., & Ruthotto, L. (2021). OT-Flow: Fast and Accurate Continuous Normalizing Flows via Optimal Transport. Proceedings of the AAAI Conference on Artificial Intelligence, 35(10), 9223-9232. https://doi.org/10.1609/aaai.v35i10.17113



AAAI Technical Track on Machine Learning III