Adaptive Boosting for Transfer Learning Using Dynamic Updates

Instance-based transfer learning methods use labeled examples from one domain to improve learning performance in another domain through knowledge transfer. Boosting-based transfer learning algorithms are a subset of such methods and have been applied successfully within the transfer learning community. In this paper, we address some of the weaknesses of these algorithms and extend the most popular transfer boosting algorithm, TrAdaBoost. We introduce a dynamic factor into TrAdaBoost so that it fulfills its original design goal of combining the advantages of both AdaBoost and the "Weighted Majority Algorithm". We analyze, both theoretically and empirically, the effect of this factor on the boosting performance of TrAdaBoost, and we apply it as a "correction factor" that significantly improves classification performance. Our experimental results on several real-world datasets demonstrate the effectiveness of our framework in obtaining better classification results.
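
To make the idea concrete, below is a minimal Python sketch of a TrAdaBoost-style boosting loop with a dynamic correction factor applied to the source-instance weight update. The function name `dynamic_tradaboost`, the choice of decision stumps as weak learners, and the specific form of the correction factor `C_t = 2 * (1 - eps_t)` are illustrative assumptions for this sketch, not necessarily the paper's exact formulation.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def dynamic_tradaboost(X_src, y_src, X_tgt, y_tgt, n_iters=20):
    """Sketch of TrAdaBoost with a multiplicative correction factor on the
    source-instance weight update. Labels are assumed to be in {0, 1}."""
    n_src = len(X_src)
    X = np.vstack([X_src, X_tgt])
    y = np.concatenate([y_src, y_tgt])
    w = np.ones(len(X)) / len(X)                   # instance weights

    # Fixed rate that down-weights misclassified source instances (WMA-style).
    beta_src = 1.0 / (1.0 + np.sqrt(2.0 * np.log(n_src) / n_iters))

    learners, beta_ts = [], []
    for t in range(n_iters):
        p = w / w.sum()                            # normalized distribution
        clf = DecisionTreeClassifier(max_depth=1)  # weak learner (stump)
        clf.fit(X, y, sample_weight=p)
        err = np.abs(clf.predict(X) - y)           # 0/1 loss per instance

        # Weighted error measured on the target portion only.
        eps_t = np.sum(p[n_src:] * err[n_src:]) / np.sum(p[n_src:])
        eps_t = min(max(eps_t, 1e-10), 0.499)      # keep beta_t well defined
        beta_t = eps_t / (1.0 - eps_t)

        # Assumed dynamic correction factor rescaling the source weights,
        # intended to counter their drift toward zero during normalization.
        C_t = 2.0 * (1.0 - eps_t)

        # Source: shrink weights of misclassified instances, rescaled by C_t.
        w[:n_src] *= C_t * (beta_src ** err[:n_src])
        # Target: grow weights of misclassified instances (AdaBoost-style).
        w[n_src:] *= beta_t ** (-err[n_src:])

        learners.append(clf)
        beta_ts.append(beta_t)

    def predict(X_new):
        # Weighted vote over the second half of the boosting iterations.
        half = n_iters // 2
        score = sum(-np.log(beta_ts[t]) * learners[t].predict(X_new)
                    for t in range(half, n_iters))
        thresh = 0.5 * sum(-np.log(beta_ts[t]) for t in range(half, n_iters))
        return (score >= thresh).astype(int)

    return predict
```

The intuition behind the correction is that, in plain TrAdaBoost, the normalization step steadily shifts weight mass from source to target instances even when the source instances are classified correctly; rescaling the source weights by a factor tied to the per-round target error is one way to compensate for that drift.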

[1] Qiang Yang, et al. A Survey on Transfer Learning, 2010, IEEE Transactions on Knowledge and Data Engineering.

[2] Yang Wang, et al. Cost-sensitive boosting for classification of imbalanced data, 2007, Pattern Recognition.

[3] Hendrik Blockeel, et al. Knowledge Discovery in Databases: PKDD 2003, 2003, Lecture Notes in Computer Science.

[4] Yoav Freund, et al. A decision-theoretic generalization of on-line learning and an application to boosting, 1995, EuroCOLT.

[5] Qiang Yang, et al. Transfer Learning via Dimensionality Reduction, 2008, AAAI.

[6] Qiang Yang, et al. Co-clustering based classification for out-of-domain documents, 2007, KDD '07.

[7] Eric Eaton, et al. Selective knowledge transfer for machine learning, 2009.

[8] Peter Stone, et al. Boosting for Regression Transfer, 2010, ICML.

[9] Nitesh V. Chawla, et al. SMOTEBoost: Improving Prediction of the Minority Class in Boosting, 2003, PKDD.

[10] Yi Yao, et al. Boosting for transfer learning with multiple sources, 2010, IEEE Computer Society Conference on Computer Vision and Pattern Recognition.

[11] Eric Eaton, et al. Set-Based Boosting for Instance-Level Transfer, 2009, IEEE International Conference on Data Mining Workshops.

[12] Manfred K. Warmuth, et al. The weighted majority algorithm, 1989, 30th Annual Symposium on Foundations of Computer Science.

[13] Jin Hyeong Park, et al. Multi-resolution boosting for classification and regression problems, 2009, Knowledge and Information Systems.

[14] Thomas G. Dietterich, et al. Improving SVM accuracy by training on auxiliary data sources, 2004, ICML.

[15] Hans-Peter Kriegel, et al. Integrating structured biological data by Kernel Maximum Mean Discrepancy, 2006, ISMB.

[16] Qiang Yang, et al. Adaptive Transfer Learning, 2010, AAAI.

[17] Sethuraman Panchanathan, et al. Cost-sensitive Boosting for Concept Drift, 2010.

[18] Ken Lang, et al. NewsWeeder: Learning to Filter Netnews, 1995, ICML.

[19] Ivor W. Tsang, et al. Domain Adaptation via Transfer Component Analysis, 2009, IEEE Transactions on Neural Networks.

[20] Qiang Yang, et al. Boosting for transfer learning, 2007, ICML '07.