A quantum-inspired classical algorithm for recommendation systems

We give a classical analogue to Kerenidis and Prakash’s quantum recommendation system, previously believed to be one of the strongest candidates for provably exponential speedups in quantum machine learning. Our main result is an algorithm that, given an m × n matrix in a data structure supporting certain ℓ2-norm sampling operations, outputs an ℓ2-norm sample from a rank-k approximation of that matrix in time O(poly(k) log(mn)), only polynomially slower than the quantum algorithm. As a consequence, Kerenidis and Prakash’s algorithm does not in fact give an exponential speedup over classical algorithms. Further, under strong input assumptions, the classical recommendation system resulting from our algorithm produces recommendations exponentially faster than previous classical systems, which run in time linear in m and n. The main insight of this work is the use of simple routines to manipulate ℓ2-norm sampling distributions, which play the role of quantum superpositions in the classical setting. This correspondence indicates a potentially fruitful framework for formally comparing quantum machine learning algorithms to classical machine learning algorithms.
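To make the sampling primitive concrete: ℓ2-norm sampling access to a vector v means drawing index i with probability v_i^2 / ||v||^2. One standard way to support this in O(log n) time per sample or update is a binary tree of squared partial sums. The sketch below is a minimal Python version of that idea, not code from the paper; the class name `L2SampleTree` and its interface are illustrative assumptions.

```python
import random


class L2SampleTree:
    """Complete binary tree over a vector's entries, storing partial sums
    of squared magnitudes. Supports entry updates and l2-norm samples
    (index i drawn with probability v_i^2 / ||v||^2) in O(log n) time.
    Illustrative sketch of the assumed sampling access, not the paper's
    own data structure or notation."""

    def __init__(self, values):
        self.n = len(values)
        self.size = 1
        while self.size < self.n:
            self.size *= 2
        # tree[1] is the root; leaf i lives at tree[size + i].
        self.tree = [0.0] * (2 * self.size)
        for i, v in enumerate(values):
            self.tree[self.size + i] = v * v
        for j in range(self.size - 1, 0, -1):
            self.tree[j] = self.tree[2 * j] + self.tree[2 * j + 1]

    def update(self, i, v):
        """Set entry i to v, refreshing squared sums on the root path."""
        j = self.size + i
        self.tree[j] = v * v
        j //= 2
        while j >= 1:
            self.tree[j] = self.tree[2 * j] + self.tree[2 * j + 1]
            j //= 2

    def sample(self):
        """Walk from the root, branching in proportion to each child's
        squared mass; returns an index drawn by l2-norm sampling."""
        r = random.random() * self.tree[1]
        j = 1
        while j < self.size:
            left = self.tree[2 * j]
            if r < left:
                j = 2 * j
            else:
                r -= left
                j = 2 * j + 1
        return j - self.size
```

For an m × n matrix A, the same structure is used at two levels: one tree over the row norms selects a row i, and a per-row tree then selects a column j, so the pair (i, j) is drawn with probability A_ij^2 / ||A||_F^2, which is the form of matrix access both the quantum algorithm and its classical analogue assume.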

[1] Nathan Srebro, et al. Beating SGD: Learning SVMs in Sublinear Time, 2011, NIPS.

[2] G. Stewart, et al. New perturbation analyses for the Cholesky factorization, 1996.

[3] S. Lloyd, et al. Quantum principal component analysis, 2013, Nature Physics.

[4] Anna R. Karlin, et al. Spectral analysis of data, 2001, STOC '01.

[5] Ravi Kumar, et al. Recommendation systems: a probabilistic analysis, 1998, FOCS '98.

[6] A. Prakash, et al. Quantum gradient descent for linear systems and least squares, 2017, Physical Review A.

[7] Santosh Vempala, et al. Randomized algorithms in numerical linear algebra, 2017, Acta Numerica.

[8] Jon M. Kleinberg, et al. Using mixture models for collaborative filtering, 2004, STOC '04.

[9] James Bennett, et al. The Netflix Prize, 2007.

[10] S. Muthukrishnan, et al. Relative-Error CUR Matrix Decompositions, 2007, SIAM J. Matrix Anal. Appl.

[11] Nathan Wiebe, et al. Quantum singular value transformation and beyond: exponential improvements for quantum matrix arithmetics, 2018, STOC.

[12] Petros Drineas, et al. Fast Monte Carlo Algorithms for Matrices I: Approximating Matrix Multiplication, 2006, SIAM J. Comput.

[13] John Preskill, et al. Quantum Computing in the NISQ era and beyond, 2018, Quantum.

[14] Nader H. Bshouty, et al. Learning DNF over the uniform distribution using a quantum example oracle, 1995, COLT '95.

[15] Benjamin Recht, et al. A Simpler Approach to Matrix Completion, 2009, J. Mach. Learn. Res.

[16] Santosh S. Vempala, et al. Adaptive Sampling and Fast Low-Rank Matrix Approximation, 2006, APPROX-RANDOM.

[17] Prabhakar Raghavan, et al. Competitive recommendation systems, 2002, STOC '02.

[18] Alan M. Frieze, et al. Fast Monte-Carlo algorithms for finding low-rank approximations, 2004, JACM.

[19] S. Aaronson. Read the fine print, 2015, Nature Physics.

[20] Yehuda Koren, et al. Lessons from the Netflix prize challenge, 2007, SIGKDD Explorations.

[21] Dimitris Achlioptas, et al. Fast computation of low-rank matrix approximations, 2007, JACM.

[22] Iordanis Kerenidis, et al. Quantum Recommendation Systems, 2016, ITCS.

[23] Ravi Kumar, et al. Recommendation Systems, 2001.

[24] David P. Woodruff, et al. Sublinear Time Orthogonal Tensor Decomposition, 2016, NIPS.

[25] S. Lloyd, et al. Quantum algorithms for supervised and unsupervised machine learning, 2013, arXiv:1307.0411.

[26] Yehuda Koren, et al. Matrix Factorization Techniques for Recommender Systems, 2009, Computer.

[27] Gilles Brassard, et al. Strengths and Weaknesses of Quantum Computing, 1997, SIAM J. Comput.

[28] A. Harrow, et al. Quantum algorithm for linear systems of equations, 2008, Physical Review Letters.

[29] Boaz Patt-Shamir, et al. Improved recommendation systems, 2005, SODA '05.