Quantum Boosting using Domain-Partitioning Hypotheses

Boosting is an ensemble learning method that converts a weak learner into a strong learner in the PAC learning framework. Freund and Schapire gave the first classical boosting algorithm for binary hypotheses, known as AdaBoost, which was recently adapted into a quantum boosting algorithm by Arunachalam et al. Their quantum boosting algorithm (which we refer to as Q-AdaBoost) is quadratically faster than the classical version in terms of the VC dimension of the weak learner's hypothesis class, but polynomially worse in the bias of the weak learner. In this work we design a different quantum boosting algorithm that uses domain-partitioning hypotheses, which are significantly more flexible than those used in prior quantum boosting algorithms in terms of margin calculations. Our algorithm, Q-RealBoost, is inspired by the "Real AdaBoost" (a.k.a. RealBoost) extension of the original AdaBoost algorithm. Further, we show that Q-RealBoost provides a polynomial speedup over Q-AdaBoost in terms of both the bias of the weak learner and the time taken by the weak learner to learn the target concept class.

2012 ACM Subject Classification: Hardware → Quantum computation; Theory of computation → Machine learning theory
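As context for the classical primitive the paper builds on, the confidence-rated update that RealBoost performs over a domain partition can be sketched as follows. This is a minimal classical illustration with hypothetical helper names, not the paper's quantum algorithm (which replaces the weight bookkeeping with quantum subroutines): each weak hypothesis here is a threshold stump that partitions the domain into two blocks and outputs a real-valued confidence 0.5·ln(W⁺/W⁻) per block, following Schapire and Singer's confidence-rated formulation.

```python
import math

EPS = 1e-10  # smoothing constant to avoid log(0) and division by zero

def fit_stump(X, y, w):
    """Pick the (feature, threshold) partition minimizing Z = 2 * sum_j sqrt(W+_j * W-_j)."""
    d = len(X[0])
    best = None
    for f in range(d):
        for t in sorted(set(x[f] for x in X)):
            wp = [EPS, EPS]  # total weight of positive examples per block
            wm = [EPS, EPS]  # total weight of negative examples per block
            for xi, yi, wi in zip(X, y, w):
                j = 0 if xi[f] <= t else 1
                if yi > 0:
                    wp[j] += wi
                else:
                    wm[j] += wi
            z = 2 * sum(math.sqrt(wp[j] * wm[j]) for j in (0, 1))
            if best is None or z < best[0]:
                # confidence-rated output per block: 0.5 * ln(W+ / W-)
                c = [0.5 * math.log(wp[j] / wm[j]) for j in (0, 1)]
                best = (z, f, t, c)
    return best[1:]

def realboost(X, y, rounds=10):
    """Boost threshold stumps with real-valued (confidence-rated) outputs."""
    n = len(X)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        f, t, c = fit_stump(X, y, w)
        ensemble.append((f, t, c))
        # reweight: w_i <- w_i * exp(-y_i * h(x_i)), then renormalize
        w = [wi * math.exp(-yi * c[0 if xi[f] <= t else 1])
             for xi, yi, wi in zip(X, y, w)]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict(ensemble, x):
    """Sign of the summed real-valued margins of all weak hypotheses."""
    margin = sum(c[0 if x[f] <= t else 1] for f, t, c in ensemble)
    return 1 if margin >= 0 else -1
```

The flexibility the abstract refers to shows up in the per-block confidences: unlike AdaBoost's single ±1 vote per hypothesis, each block of the partition contributes its own real-valued margin, which is what the quantum algorithm must estimate.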

[1] Krysta Marie Svore, et al. Quantum Speed-Ups for Solving Semidefinite Programs. FOCS 2017.

[2] Hiroshi Nagahashi, et al. Penalized AdaBoost: Improving the Generalization Error of Gentle AdaBoost through a Margin Distribution. IEICE Trans. Inf. Syst., 2015.

[3] Dmitry Gavinsky. Optimally-Smooth Adaptive Boosting and Application to Agnostic Learning. J. Mach. Learn. Res., 2003.

[4] Rocco A. Servedio, et al. Smooth Boosting and Learning with Malicious Noise. J. Mach. Learn. Res., 2001.

[5] D. Opitz, et al. Popular Ensemble Methods: An Empirical Study. J. Artif. Intell. Res., 1999.

[6] Qiuqi Ruan, et al. Real AdaBoost Feature Selection for Face Recognition. IEEE 10th International Conference on Signal Processing, 2010.

[7] Maria Schuld, et al. Quantum Ensembles of Quantum Classifiers. Scientific Reports, 2017.

[8] Manfred K. Warmuth, et al. The Weighted Majority Algorithm. Inf. Comput., 1994.

[9] Yoram Singer, et al. Improved Boosting Algorithms Using Confidence-Rated Predictions. COLT 1998.

[10] Gunnar Rätsch, et al. An Introduction to Boosting and Leveraging. Machine Learning Summer School, 2002.

[11] Ronald de Wolf, et al. Improved Quantum Boosting. arXiv, 2020.

[12] Miklos Santha, et al. Quantum Algorithms for Hedging and the Learning of Ising Models. 2021.

[13] Eric Bauer, et al. An Empirical Comparison of Voting Classification Algorithms: Bagging, Boosting, and Variants. Machine Learning, 1999.

[14] Srinivasan Arunachalam, et al. On the Robustness of Bucket Brigade Quantum RAM. TQC 2015.

[15] Peter Wittek. Quantum Machine Learning: What Quantum Computing Means to Data Mining. 2014.

[16] G. Brassard, et al. Quantum Amplitude Amplification and Estimation. quant-ph/0005055, 2000.

[17] Pedro M. Domingos, et al. Tree Induction for Probability-Based Ranking. Machine Learning, 2003.

[18] Ronald de Wolf, et al. Guest Column: A Survey of Quantum Learning Theory. SIGACT News, 2017.

[19] Nader H. Bshouty, et al. Learning DNF over the Uniform Distribution Using a Quantum Example Oracle. COLT 1995.

[20] F. Petruccione, et al. An Introduction to Quantum Machine Learning. Contemporary Physics, 2014.

[21] Corinna Cortes, et al. Boosting Decision Trees. NIPS 1995.

[22] Andris Ambainis. Quantum Search with Variable Times. STACS 2008.

[23] Yoav Freund and Robert E. Schapire. A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting. EuroCOLT 1995.

[24] Srinivasan Arunachalam, et al. Quantum Boosting. ICML 2020.

[25] Ronald de Wolf, et al. Optimal Quantum Sample Complexity of Learning Algorithms. CCC 2016.

[26] Hiroshi Nagahashi, et al. Parameterized AdaBoost: Introducing a Parameter to Speed Up the Training of Real AdaBoost. IEEE Signal Processing Letters, 2014.

[27] Joran van Apeldoorn. Quantum Probability Oracles & Multidimensional Amplitude Estimation. TQC 2021.

[28] Yoav Freund. Boosting a Weak Learning Algorithm by Majority. COLT '90, 1995.

[29] Ximing Wang, et al. Quantum Speedup in Adaptive Boosting of Binary Classification. Science China Physics, Mechanics & Astronomy, 2019.

[30] Bo Wu, et al. Fast Rotation Invariant Multi-View Face Detection Based on Real AdaBoost. IEEE International Conference on Automatic Face and Gesture Recognition, 2004.

[31] András Gilyén, et al. Improvements in Quantum SDP-Solving with Applications. ICALP 2018.

[32] Adam Tauman Kalai, et al. On Agnostic Boosting and Parity Learning. STOC 2008.

[33] Hartmut Neven, et al. QBoost: Large Scale Classifier Training with Adiabatic Quantum Optimization. ACML 2012.

[34] Dmitry Gavinsky, et al. On Boosting with Polynomially Bounded Distributions. J. Mach. Learn. Res., 2002.

[35] Peter Clark, et al. The CN2 Induction Algorithm. Machine Learning, 1989.

[36] Shai Ben-David, et al. Agnostic Boosting. COLT/EuroCOLT 2001.

[37] Yoav Freund. Boosting a Weak Learning Algorithm by Majority. COLT '90, 1990.

[38] Han Wang, et al. 2D Staircase Detection Using Real AdaBoost. ICICS 2009.

[39] Thomas G. Dietterich. An Experimental Comparison of Three Methods for Constructing Ensembles of Decision Trees: Bagging, Boosting, and Randomization. Machine Learning, 2000.

[40] Alex M. Andrew. Boosting: Foundations and Algorithms. 2012.

[41] Ievgeniia Oshurko. Quantum Machine Learning. Quantum Computing, 2020.

[42] R. Schapire. The Strength of Weak Learnability. Machine Learning, 1990.

[43] Alexander Vezhnevets, et al. 'Modest AdaBoost' – Teaching AdaBoost to Generalize Better. 2005.