Overview of accurate coresets

A coreset of an input set is a small summary of that set such that, for a given family of problems (models, classifiers, loss functions), solving a problem on the coreset provably yields the same result as solving it on the original (full) set. Coresets have been suggested for many fundamental problems in, for example, machine and deep learning, computer vision, databases, and theoretical computer science. This introductory paper was written in response to recurring requests concerning the many inconsistent coreset definitions, the lack of source code, the deep theoretical background required from several fields, and the dense papers that make it hard for beginners to apply and develop coresets. The article provides folklore, classic, and simple results, including step-by-step proofs and figures, for the simplest (accurate) coresets. Nevertheless, we did not find most of these constructions in the literature. Moreover, we expect that presenting them together in a retrospective context will help the reader grasp current results, which usually generalize these fundamental observations. Experts might appreciate the unified notation and the comparison table of existing results. Open source code is provided for all presented algorithms, both to demonstrate their usage and to support readers who are more comfortable with programming than with mathematics.
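To make the definition concrete, below is a minimal Python sketch of perhaps the simplest accurate coreset: the folklore summary for the 1-mean (sum of squared distances) cost. It relies on the identity sum_i ||p_i - q||^2 = n ||mu - q||^2 + sum_i ||p_i - mu||^2, where mu is the mean of the points. The function names and structure here are ours for illustration, not taken from the paper's accompanying repository.

```python
import numpy as np

def build_coreset(P):
    """Compress an (n x d) point set into a constant-size accurate coreset.

    Stores only the number of points n, the mean mu, and the scalar
    c = sum_i ||p_i - mu||^2 -- enough to recover the 1-mean cost of
    ANY query point q exactly, via
        sum_i ||p_i - q||^2 = n * ||mu - q||^2 + c.
    """
    n = P.shape[0]
    mu = P.mean(axis=0)
    c = np.sum((P - mu) ** 2)
    return n, mu, c

def coreset_cost(coreset, q):
    """Evaluate the 1-mean cost of query q using only the coreset."""
    n, mu, c = coreset
    return n * np.sum((mu - q) ** 2) + c

# Usage: the cost recovered from the coreset matches the exact cost
# computed over the full input, for an arbitrary query point.
rng = np.random.default_rng(0)
P = rng.standard_normal((1000, 5))   # n = 1000 points in d = 5 dimensions
q = rng.standard_normal(5)           # an arbitrary query point
exact = np.sum((P - q) ** 2)
from_coreset = coreset_cost(build_coreset(P), q)
assert np.isclose(exact, from_coreset)
```

The summary has size O(d) regardless of n, and the recovered cost is exact rather than approximate; this exactness, rather than an epsilon-multiplicative error, is what distinguishes accurate coresets from the more common approximation coresets.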
