A Simple and Efficient Tensor Calculus

Authors

  • Sören Laue, Friedrich-Schiller-University Jena
  • Matthias Mitterreiter, Friedrich-Schiller-University Jena
  • Joachim Giesen, Friedrich-Schiller-University Jena

DOI:

https://doi.org/10.1609/aaai.v34i04.5881

Abstract

Computing derivatives of tensor expressions, also known as tensor calculus, is a fundamental task in machine learning. A key concern is the efficiency of evaluating the expressions and their derivatives, which hinges on the representation of these expressions. Recently, an algorithm for computing higher-order derivatives of tensor expressions, such as Jacobians or Hessians, has been introduced that is a few orders of magnitude faster than previous state-of-the-art approaches. Unfortunately, that approach is based on Ricci notation and hence cannot be incorporated into automatic differentiation frameworks like TensorFlow, PyTorch, autograd, or JAX, which use the simpler Einstein notation. This leaves two options: either change the underlying tensor representation in these frameworks, or develop a new, provably correct algorithm based on Einstein notation. Since the first option is impractical, we pursue the second. We show that Ricci notation is not necessary for an efficient tensor calculus and develop an equally efficient method for the simpler Einstein notation. It turns out that turning to Einstein notation enables further improvements that lead to even better efficiency.
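The paper's algorithm itself is not reproduced here; as a rough illustration of the Einstein-notation setting it targets, the following minimal sketch uses JAX (one of the frameworks named above) to write a tensor expression as an einsum and differentiate it twice via automatic differentiation. The matrix A, the function f, and the input x are illustrative choices, not taken from the paper.

    import jax
    import jax.numpy as jnp

    # A tensor expression in Einstein notation: f(x) = x_i A_ij x_j.
    # jnp.einsum interprets the index string as an Einstein summation.
    A = jnp.array([[1.0, 2.0],
                   [3.0, 4.0]])

    def f(x):
        return jnp.einsum('i,ij,j->', x, A, x)

    x = jnp.array([1.0, -1.0])

    print(jax.grad(f)(x))     # gradient (A + A^T) x  -> [-3. -3.]
    print(jax.hessian(f)(x))  # Hessian  A + A^T      -> [[2. 5.] [5. 8.]]

Frameworks like JAX obtain such derivatives by automatic differentiation of the computation graph; the paper's contribution is a provably correct tensor calculus that computes higher-order derivatives such as Jacobians and Hessians directly in Einstein notation.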

Published

2020-04-03

How to Cite

Laue, S., Mitterreiter, M., & Giesen, J. (2020). A Simple and Efficient Tensor Calculus. Proceedings of the AAAI Conference on Artificial Intelligence, 34(04), 4527-4534. https://doi.org/10.1609/aaai.v34i04.5881

Section

AAAI Technical Track: Machine Learning