Determinantal point processes based on orthogonal polynomials for sampling minibatches in SGD

Stochastic gradient descent (SGD) is a cornerstone of machine learning. When the number N of data items is large, SGD relies on constructing an unbiased estimator of the gradient of the empirical risk using a small subset of the original dataset, called a minibatch. Default minibatch construction involves uniformly sampling a subset of the desired size, but alternatives have been explored for variance reduction. In particular, experimental evidence suggests drawing minibatches from determinantal point processes (DPPs), tractable distributions over minibatches that favour diversity among selected items. However, as in recent work on DPPs for coresets, providing a systematic and principled understanding of how and why DPPs help has been difficult. In this work, we contribute an orthogonal polynomial-based determinantal point process paradigm for minibatch sampling in SGD. Our approach leverages the specific data distribution at hand, which endows it with greater sensitivity and power than existing data-agnostic methods. We substantiate our method via a detailed theoretical analysis of its convergence properties, interweaving the discrete dataset and the underlying continuous domain. In particular, we show how specific DPPs and a string of controlled approximations can lead to gradient estimators whose variance decays faster with the batchsize than under uniform sampling. Coupled with existing finite-time guarantees for SGD on convex objectives, this entails that, for a large enough batchsize and a fixed budget of item-level gradients to evaluate, DPP minibatches lead to a smaller bound on the mean square approximation error than uniform minibatches. Moreover, our estimators are amenable to a recent algorithm that directly samples linear statistics of DPPs (i.e., the gradient estimator) without sampling the underlying DPP (i.e., the minibatch), thereby reducing computational overhead. We provide detailed synthetic and real-data experiments to substantiate our theoretical claims.

Authors listed in alphabetical order.
Université de Lille, CNRS, Centrale Lille, UMR 9189 CRIStAL, F-59000 Lille, France (remi.bardenet@univ-lille.fr). Corresponding author.
National University of Singapore, Department of Mathematics, 10 Lower Kent Ridge Road, 119076, Singapore (subhrowork@gmail.com). Corresponding author.
National University of Singapore, Institute of Operations Research and Analytics, 10 Lower Kent Ridge Road, 119076, Singapore (lin_meixia@u.nus.edu).

arXiv:2112.06007v1 [stat.ML] 11 Dec 2021
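To make the reweighting idea behind such gradient estimators concrete, the following is a minimal sketch. It assumes the DPPy package, a toy linear-regression loss, a Gaussian likelihood kernel L, and a Horvitz-Thompson-style reweighting by marginal inclusion probabilities; all of these are illustrative choices and not the paper's orthogonal-polynomial construction. A fixed-size DPP minibatch is drawn, and each selected item-level gradient is divided by N times its inclusion probability, which keeps the estimator of the mean gradient (approximately) unbiased.

import numpy as np
from dppy.finite_dpps import FiniteDPP

# Toy data for a linear-regression empirical risk (illustrative assumption).
rng = np.random.default_rng(0)
N, d, k = 200, 2, 10                      # dataset size, feature dimension, batchsize
X = rng.standard_normal((N, d))
y = X @ np.array([1.0, -2.0]) + 0.1 * rng.standard_normal(N)

# Gaussian likelihood kernel L: similar items make L nearly singular,
# so the DPP favours diverse minibatches (illustrative choice of kernel).
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
L = np.exp(-0.5 * sq_dists)

# Draw a fixed-size (k-)DPP minibatch with DPPy.
dpp = FiniteDPP('likelihood', L=L)
batch = dpp.sample_exact_k_dpp(size=k)

# Marginal inclusion probabilities pi_i = K_ii of the L-ensemble with
# K = L (I + L)^{-1}; for the k-DPP these only approximate the true
# inclusion probabilities, so the reweighting below is approximate.
K = L @ np.linalg.inv(np.eye(N) + L)
pi = np.diag(K)

def item_gradient(theta, i):
    # Gradient of the per-item squared loss (x_i^T theta - y_i)^2.
    return 2.0 * (X[i] @ theta - y[i]) * X[i]

# Reweighted minibatch estimator of the mean gradient (1/N) * sum_i grad_i:
# dividing by N * pi_i makes it unbiased under the unconstrained DPP.
theta = np.zeros(d)
grad_hat = sum(item_gradient(theta, i) / (N * pi[i]) for i in batch)

In the paper itself, the kernel and the reweighting are instead built from orthogonal polynomials adapted to the data distribution, and the linear statistic (the gradient estimator) can be sampled directly without materializing the minibatch.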
