The idea of ensemble learning is to build a prediction model by combining the strengths of a collection of simpler base models. Although they are widely used, ensemble methods are demanding in terms of memory and computation time. In this work, we propose a quantum algorithm that reproduces ensemble classification based on the bagging strategy. The algorithm generates many sub-samples in superposition, so that only a single execution of a quantum classifier is required. In particular, the entanglement between a quantum control register and the different training sub-samples in superposition yields a sum of the individual results, which gives rise to the ensemble prediction. In the overall time cost of the algorithm, the single base classifier therefore contributes additively rather than multiplicatively, as is usually the case in the classical ensemble framework. Furthermore, since the number of base models scales exponentially with the number of qubits in the control register, our algorithm opens up the possibility of an exponential speed-up for quantum ensembles.
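The averaging mechanism described above can be illustrated with a small classical state-vector sketch. This is only a toy emulation, not the proposed quantum circuit: the data set, the nearest-centroid base classifier, and all names below are hypothetical. It shows how entangling a control register (indexing bootstrap sub-samples) with a one-qubit prediction register makes a single measurement of that register return the bagging vote share.

# Toy state-vector sketch of the bagging-in-superposition idea (illustration only).
import numpy as np

rng = np.random.default_rng(0)

n_control = 3                 # control qubits -> d = 2**n_control base models
d = 2 ** n_control

# Hypothetical toy data set and test point, purely for illustration.
X = rng.normal(size=(20, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
x_test = np.array([0.1, -0.05])

def base_predict(sample_idx, x):
    """A stand-in 'quantum classifier': nearest-centroid on one bootstrap sample."""
    Xs, ys = X[sample_idx], y[sample_idx]
    c0 = Xs[ys == 0].mean(axis=0) if np.any(ys == 0) else np.full(2, np.inf)
    c1 = Xs[ys == 1].mean(axis=0) if np.any(ys == 1) else np.full(2, np.inf)
    return int(np.linalg.norm(x - c1) < np.linalg.norm(x - c0))

# Each control basis state |c> is entangled with the prediction of the classifier
# trained on "its" bootstrap sub-sample:  (1/sqrt(d)) * sum_c |c>|f_c(x)>
state = np.zeros((d, 2))
for c in range(d):
    sample = rng.integers(0, len(X), size=len(X))   # bootstrap sub-sample with replacement
    state[c, base_predict(sample, x_test)] = 1.0 / np.sqrt(d)

# Measuring only the prediction qubit gives P(1) = (1/d) * sum_c f_c(x),
# i.e. the bagging vote share; thresholding it yields the ensemble label.
p1 = np.sum(state[:, 1] ** 2)
print(f"vote share for class 1: {p1:.3f} -> ensemble prediction: {int(p1 > 0.5)}")

Note that the classical sketch still loops over the d sub-samples to build the entangled state; the point of the quantum algorithm is precisely that this step is performed in superposition with a single classifier execution.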