Related Papers

A Convex Formulation for Learning Task Relationships in Multi-Task Learning

Abstract: Multi-task learning is a learning paradigm that seeks to improve the generalization performance of a learning task with the help of other related tasks. In this paper, we propose a regularization formulation for learning the relationships between tasks in multi-task learning. This formulation can be viewed as a novel generalization of the regularization framework for single-task learning. Besides modeling positive task correlation, our method, called multi-task relationship learning (MTRL), can also describe negative task correlation and identify outlier tasks based on the same underlying principle. Under this regularization framework, the objective function of MTRL is convex. For efficiency, we use an alternating method to learn the optimal model parameters for each task as well as the relationships between tasks. We study MTRL in the symmetric multi-task learning setting and then generalize it to the asymmetric setting. We also study the relationships between MTRL and some existing multi-task learning methods. Experiments on a toy problem and several benchmark data sets demonstrate the effectiveness of MTRL.
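As a rough illustration of the alternating method described in the abstract, the sketch below implements an MTRL-style objective with squared loss: it alternates between gradient steps on the per-task weight matrix W under the trace regularizer tr(W Ω⁻¹ Wᵀ) and the closed-form update of the task covariance Ω ∝ (WᵀW)^{1/2}. This is a minimal NumPy sketch under stated assumptions, not the authors' exact algorithm; all function names and hyperparameter values (`mtrl_fit`, `lam1`, `lam2`, the ridge constants) are illustrative.

```python
import numpy as np

def psd_sqrt(M):
    # Symmetric PSD matrix square root via eigendecomposition.
    vals, vecs = np.linalg.eigh(M)
    return vecs @ np.diag(np.sqrt(np.clip(vals, 0.0, None))) @ vecs.T

def mtrl_fit(Xs, ys, lam1=0.1, lam2=0.1, outer=20, gd_steps=200, lr=0.05):
    """Alternating optimization sketch for an MTRL-style objective:
        min over W, Omega of
          sum_i ||X_i w_i - y_i||^2 / n_i
          + lam1 ||W||_F^2 + lam2 tr(W Omega^{-1} W^T)
        subject to Omega PSD, tr(Omega) = 1.
    Xs, ys: lists of per-task data; column i of W holds task i's weights."""
    m, d = len(Xs), Xs[0].shape[1]
    W = np.zeros((d, m))
    Omega = np.eye(m) / m            # start from uncorrelated tasks, tr = 1
    for _ in range(outer):
        # Small ridge keeps the inverse well conditioned near singular Omega.
        Oinv = np.linalg.inv(Omega + 1e-2 * np.eye(m))
        # Step 1: fix Omega, take gradient steps on W.
        for _ in range(gd_steps):
            G = np.column_stack([
                Xs[i].T @ (Xs[i] @ W[:, i] - ys[i]) / len(ys[i])
                for i in range(m)
            ])
            G += lam1 * W + lam2 * (W @ Oinv)
            W -= lr * G
        # Step 2: fix W, closed-form update Omega = (W^T W)^{1/2} / tr(.).
        S = psd_sqrt(W.T @ W + 1e-8 * np.eye(m))
        Omega = S / np.trace(S)
    return W, Omega
```

After fitting, the off-diagonal entries of Ω indicate learned task relationships: positive values for positively correlated tasks, negative for competing tasks, and near-zero rows for outlier tasks, matching the behavior the abstract attributes to MTRL.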

References

[1] Sebastian Thrun, et al. Is Learning The n-th Thing Any Easier Than Learning The First?, 1995, NIPS.

[2] Sebastian Thrun, et al. Discovering Structure in Multiple Learning Tasks: The TC Algorithm, 1996, ICML.

[3] Rich Caruana, et al. Multitask Learning, 1997, Machine Learning.

[4] Stephen P. Boyd, et al. Applications of second-order cone programming, 1998.

[5] A. Rukhin. Matrix Variate Distributions, 1999, The Multivariate Normal Distribution.

[6] Tom Heskes, et al. Task Clustering and Gating for Bayesian Multitask Learning, 2003, J. Mach. Learn. Res.

[7] S. Keerthi, et al. SMO algorithm for least squares SVM, 2003, Proceedings of the International Joint Conference on Neural Networks.

[8] Jonathan Baxter, et al. A Bayesian/Information Theoretic Model of Learning to Learn via Multiple Task Sampling, 1997, Machine Learning.

[9] Johan A. K. Suykens, et al. Benchmarking Least Squares Support Vector Machine Classifiers, 2004, Machine Learning.

[10] Massimiliano Pontil, et al. Regularized multi-task learning, 2004, KDD.

[11] Charles A. Micchelli, et al. Learning Multiple Tasks with Kernel Methods, 2005, J. Mach. Learn. Res.

[12] Mikhail Belkin, et al. Manifold Regularization: A Geometric Framework for Learning from Labeled and Unlabeled Examples, 2006, J. Mach. Learn. Res.

[13] Rajat Raina, et al. Constructing informative priors using transfer learning, 2006, ICML.

[14] Stephen P. Boyd, et al. Convex Optimization, 2004, Cambridge University Press.

[15] Christopher M. Bishop. Pattern Recognition and Machine Learning (Information Science and Statistics), 2006.

[16] Kumar Chellapilla, et al. Personalized handwriting recognition via biased regularization, 2006, ICML.

[17] Edwin V. Bonilla, et al. Multi-task Gaussian Process Prediction, 2007, NIPS.

[18] Radford M. Neal. Pattern Recognition and Machine Learning, 2007, Technometrics.

[19] Charles A. Micchelli, et al. A Spectral Regularization Framework for Multi-Task Structure Learning, 2007, NIPS.

[20] Masashi Sugiyama, et al. Multi-Task Learning via Conic Programming, 2007, NIPS.

[21] Lawrence Carin, et al. Multi-Task Learning for Classification with Dirichlet Process Priors, 2007, J. Mach. Learn. Res.

[22] Massimiliano Pontil, et al. Convex multi-task feature learning, 2008, Machine Learning.

[23] Jean-Philippe Vert, et al. Clustered Multi-Task Learning: A Convex Formulation, 2008, NIPS.

[24] Eric Eaton, et al. Modeling Transfer Relationships Between Learning Tasks for Improved Inductive Transfer, 2008, ECML/PKDD.

[25] Hal Daumé, et al. Bayesian Multitask Learning with Latent Hierarchies, 2009, UAI.

[26] Jieping Ye, et al. Multi-Task Feature Learning Via Efficient ℓ2,1-Norm Minimization, 2009, UAI.

[27] Qiang Yang, et al. A Survey on Transfer Learning, 2010, IEEE Transactions on Knowledge and Data Engineering.

[28] Dit-Yan Yeung, et al. Multi-Task Learning using Generalized t Process, 2010, AISTATS.

Citations
Multi-task learning deep neural networks for automatic speech recognition, 2015.

Multi-task multi-modal learning for joint diagnosis and prognosis of human cancers, Medical Image Anal., 2020.

Online Federated Multitask Learning, 2020.

Online Federated Multitask Learning, 2019 IEEE International Conference on Big Data (Big Data), 2019.

Reliable Gender Prediction Based on Users’ Video Viewing Behavior, 2016 IEEE 16th International Conference on Data Mining (ICDM), 2016.

Multi-task proximal support vector machine, Pattern Recognit., 2015.

Generalized Dictionary for Multitask Learning with Boosting, IJCAI, 2016.

Multi-Task Learning with Group-Specific Feature Space Sharing, ECML/PKDD, 2015.

Varying-coefficient models for geospatial transfer learning, Machine Learning, 2017.

Varying-coefficient models with isotropic Gaussian process priors, ArXiv, 2015.

Online Learning of Multiple Tasks and Their Relationships, AISTATS, 2011.

Measuring the Discrepancy between Conditional Distributions: Methods, Properties and Applications, IJCAI, 2020.

Robust Task Learning Based on Nonlinear Regression With Mixtures of Student-t Distributions, IEEE Access, 2020.

A Unified Framework for Structured Low-rank Matrix Learning, ICML, 2017.

A Saddle Point Approach to Structured Low-rank Matrix Learning in Large-scale Applications, ArXiv, 2017.

Asymmetric multi-task learning based on task relatedness and loss, ICML, 2016.

Multitask Learning of Deep Neural Networks for Low-Resource Speech Recognition, IEEE/ACM Transactions on Audio, Speech, and Language Processing, 2015.

Efficient Low-Rank Stochastic Gradient Descent Methods for Solving Semidefinite Programs, AISTATS, 2014.

Going deeper with convolutions, 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2014.

Collaboratively Training Sentiment Classifiers for Multiple Domains, IEEE Transactions on Knowledge and Data Engineering, 2017.