Learning a performance metric of Buchberger's algorithm

What can be (machine) learned about the complexity of Buchberger's algorithm? Given a system of polynomials, Buchberger's algorithm computes a Gröbner basis of the ideal these polynomials generate using an iterative procedure based on multivariate long division. The runtime of each step of the algorithm is typically dominated by a series of polynomial additions, and the total number of these additions is a hardware-independent performance metric that is often used to evaluate and optimize various implementation choices. In this work we attempt to predict, using just the starting input, the number of polynomial additions that take place during one run of Buchberger's algorithm. Good predictions are useful for quickly estimating difficulty and understanding what features make Gröbner basis computation hard. Our features and methods could also be used for value models in the reinforcement learning approach to optimizing Buchberger's algorithm introduced in [21]. We show that a multiple linear regression model built from a set of easy-to-compute ideal generator statistics can predict the number of polynomial additions somewhat well, better than an uninformed model, and better than regression models built on some intuitive commutative algebra invariants that are more difficult to compute. We also train a simple recursive neural network that outperforms these linear models. Our work serves as a proof of concept, demonstrating that predicting the number of polynomial additions in Buchberger's algorithm is a feasible problem from the point of view of machine learning.
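To make the performance metric concrete, the following is a minimal, illustrative sketch of a naive Buchberger implementation over the rationals under lex order, instrumented to count polynomial additions. This is an assumption-laden toy (dict-of-exponent-tuples polynomials, no pair-selection or elimination heuristics), not the optimized implementation whose addition counts the paper predicts; it only shows what "one polynomial addition" means in a run of the algorithm.

```python
# Toy sketch (not the paper's implementation): naive Buchberger over Q,
# lex order, counting polynomial additions. A polynomial is a dict mapping
# exponent tuples to Fraction coefficients.
from fractions import Fraction
from itertools import combinations

def lead(p):
    """Leading (monomial, coefficient) of a nonzero p under lex order."""
    m = max(p)  # exponent tuples compare lexicographically
    return m, p[m]

def mono_mul(a, b):
    return tuple(x + y for x, y in zip(a, b))

def mono_div(a, b):
    """Exponent difference a - b if x^b divides x^a, else None."""
    if all(x >= y for x, y in zip(a, b)):
        return tuple(x - y for x, y in zip(a, b))
    return None

def add_scaled(p, q, c, m, count):
    """Return p + c * x^m * q; each call counts as one polynomial addition."""
    count[0] += 1
    r = dict(p)
    for mq, cq in q.items():
        k = mono_mul(m, mq)
        r[k] = r.get(k, Fraction(0)) + c * cq
        if r[k] == 0:
            del r[k]
    return r

def head_reduce(p, G, count):
    """Cancel the lead term of p by G until no lead term of G divides it."""
    reduced = True
    while p and reduced:
        reduced = False
        lm, lc = lead(p)
        for g in G:
            gm, gc = lead(g)
            q = mono_div(lm, gm)
            if q is not None:
                p = add_scaled(p, g, -lc / gc, q, count)
                reduced = True
                break
    return p

def buchberger(F):
    """Return (Groebner basis, total number of polynomial additions)."""
    count = [0]
    G = [dict(f) for f in F]
    pairs = list(combinations(range(len(G)), 2))
    while pairs:
        i, j = pairs.pop()
        (mi, ci), (mj, cj) = lead(G[i]), lead(G[j])
        l = tuple(max(a, b) for a, b in zip(mi, mj))
        # S-polynomial: (x^l / lt_i) * g_i - (x^l / lt_j) * g_j
        s = add_scaled({}, G[i], 1 / ci, mono_div(l, mi), count)
        s = add_scaled(s, G[j], -1 / cj, mono_div(l, mj), count)
        s = head_reduce(s, G, count)
        if s:  # nonzero remainder: new basis element, new pairs to process
            G.append(s)
            pairs.extend((k, len(G) - 1) for k in range(len(G) - 1))
    return G, count[0]
```

For example, starting from x + y and x*y - 1 in Q[x, y] (exponent tuples are (e_x, e_y)), the run adds y^2 + 1 to the basis, and the counter records exactly the additions performed in constructing and reducing S-polynomials — the quantity the paper's models try to predict from the input alone.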

[1]  B. Sturmfels, et al. Algebraic Algorithms for Sampling from Conditional Distributions, 1998, Ann. Statist.

[2]  Michael Stillman, et al. Learning selection strategies in Buchberger's algorithm, 2020, ICML.

[3]  Volker Strassen, et al. A Fast Monte-Carlo Test for Primality, 1977, SIAM J. Comput.

[4]  Guillaume Lample, et al. Deep Learning for Symbolic Mathematics, 2019, ICLR.

[5]  David A. Cox, et al. Ideals, Varieties, and Algorithms: An Introduction to Computational Algebraic Geometry and Commutative Algebra, 3/e (Undergraduate Texts in Mathematics), 2007.

[6]  Jesús A. De Loera, et al. Short rational functions for toric algebra and applications, 2004, J. Symb. Comput.

[7]  David Naccache, et al. Gröbner Basis, 2011, Encyclopedia of Cryptography and Security.

[8]  Jesús A. De Loera, et al. Random sampling in computational algebra: Helly numbers and violator spaces, 2015, J. Symb. Comput.

[9]  Carlos Beltrán, et al. On Smale's 17th Problem: A Probabilistic Positive Solution, 2008, Found. Comput. Math.

[10]  L. M. Pardo, et al. Smale's 17th problem: Average polynomial time to compute affine and projective solutions, 2008.

[11]  J. A. De Loera, et al. Random monomial ideals, 2017, Journal of Algebra.

[12]  Thomas Dubé, et al. The Structure of Polynomial Ideals and Gröbner Bases, 2013, SIAM J. Comput.

[13]  B. Sturmfels, Gröbner bases and convex polytopes, 1995.

[14]  Jean-Charles Faugère, et al. Efficient Computation of Zero-Dimensional Gröbner Bases by Change of Ordering, 1993, J. Symb. Comput.

[15]  Jean-Charles Faugère, et al. Sparse Gröbner bases: the unmixed case, 2014, ISSAC.

[16]  Yang-Hui He, et al. Machine-Learning Mathematical Structures, 2021, International Journal of Data Science in the Mathematical Sciences.

[17]  Average behavior of minimal free resolutions of monomial ideals, 2018, Proceedings of the American Mathematical Society.

[18]  Bernd Sturmfels, et al. Learning algebraic varieties from samples, 2018, Revista Matemática Complutense.

[19]  Shang-Hua Teng, et al. Smoothed analysis of algorithms: why the simplex algorithm usually takes polynomial time, 2001, STOC '01.