Gradient Boosts the Approximate Vanishing Ideal


  • Hiroshi Kera, The University of Tokyo
  • Yoshihiko Hasegawa, The University of Tokyo



In the last decade, the approximate vanishing ideal and its basis construction algorithms have been extensively studied in computer algebra and machine learning as a general model to reconstruct the algebraic variety on which noisy data approximately lie. In particular, the basis construction algorithms developed in machine learning are widely used in applications across many fields because of their monomial-order-free property; however, they lose many of the theoretical properties of computer-algebraic algorithms. In this paper, we propose general methods that equip monomial-order-free algorithms with several advantageous theoretical properties. Specifically, we exploit the gradient to (i) sidestep the spurious vanishing problem in polynomial time to remove symbolically trivial redundant bases, (ii) achieve consistent output with respect to the translation and scaling of input, and (iii) remove nontrivially redundant bases. The proposed methods work in a fully numerical manner, whereas existing algorithms require the awkward monomial order or exponentially costly (and mostly symbolic) computation to realize properties (i) and (iii). To our knowledge, property (ii) has not been achieved by any existing basis construction algorithm of the approximate vanishing ideal.
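The spurious vanishing problem of property (i) can be illustrated with a small numerical sketch. The data, polynomial names, and normalization below are our own illustrative assumptions, not the paper's implementation: a polynomial that does not vanish on the data can be made to look vanishing by shrinking its coefficients, whereas dividing evaluations by gradient norms over the data cancels that scaling.

```python
import numpy as np

# Illustrative sketch (not the paper's implementation): points sampled
# near the unit circle, the variety of g(x, y) = x^2 + y^2 - 1.
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2.0 * np.pi, 100)
X = np.stack([np.cos(theta), np.sin(theta)], axis=1)
X += 0.01 * rng.standard_normal(X.shape)  # small perturbation

# g truly (approximately) vanishes on the data; its gradient is (2x, 2y).
g_vals = X[:, 0] ** 2 + X[:, 1] ** 2 - 1.0
g_grad_norms = np.linalg.norm(2.0 * X, axis=1)

# h(x, y) = 1e-8 * x does NOT vanish on the circle, but its tiny
# coefficient makes its evaluation vector spuriously small.
h_vals = 1e-8 * X[:, 0]
h_grad_norms = np.full(len(X), 1e-8)  # gradient is (1e-8, 0)

# Both evaluation norms are tiny, so the extent of vanishing alone
# cannot tell genuine vanishing from coefficient shrinkage.
print(np.linalg.norm(g_vals), np.linalg.norm(h_vals))

# Gradient-normalized extent of vanishing: dividing by the gradient
# norms over the data is scale-invariant and exposes h as nonvanishing.
g_score = np.linalg.norm(g_vals) / np.linalg.norm(g_grad_norms)
h_score = np.linalg.norm(h_vals) / np.linalg.norm(h_grad_norms)
print(g_score, h_score)
```

Scaling either polynomial by any constant leaves its score unchanged, which is why a gradient-weighted normalization can replace coefficient normalization without symbolic computation.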




How to Cite

Kera, H., & Hasegawa, Y. (2020). Gradient Boosts the Approximate Vanishing Ideal. Proceedings of the AAAI Conference on Artificial Intelligence, 34(04), 4428–4435.



AAAI Technical Track: Machine Learning