[1] Samuel L. Smith, et al. Batch Normalization Biases Residual Blocks Towards the Identity Function in Deep Networks, 2020, NeurIPS.
[2] Dianhai Yu, et al. Multi-Task Learning for Multiple Language Translation, 2015, ACL.
[3] Senén Barro, et al. Do we need hundreds of classifiers to solve real world classification problems?, 2014, J. Mach. Learn. Res.
[4] Patrick Kidger, et al. Universal Approximation with Deep Narrow Networks, 2019, COLT.
[5] Xiaodong Liu, et al. Representation Learning Using Multi-Task Deep Neural Networks for Semantic Classification and Information Retrieval, 2015, NAACL.
[6] Liwei Wang, et al. The Expressive Power of Neural Networks: A View from the Width, 2017, NIPS.
[7] Jasha Droppo, et al. Multi-task learning in deep neural networks for improved phoneme recognition, 2013, IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).
[8] Hal Daumé, et al. Learning Task Grouping and Overlap in Multi-task Learning, 2012, ICML.
[9] Xiaoou Tang, et al. Facial Landmark Detection by Deep Multi-task Learning, 2014, ECCV.
[10] Lihong Li, et al. PAC-inspired Option Discovery in Lifelong Reinforcement Learning, 2014, ICML.
[11] Yee Whye Teh, et al. Conditional Neural Processes, 2018, ICML.
[12] Andrea Vedaldi, et al. Universal representations: The missing link between faces, text, planktons, and cat breeds, 2017, ArXiv.
[13] Quoc V. Le, et al. Multi-task Sequence to Sequence Learning, 2015, ICLR.
[14] Ji Wu, et al. Rapid adaptation for deep neural networks through multi-task learning, 2015, INTERSPEECH.
[15] Katja Hofmann, et al. Fast Context Adaptation via Meta-Learning, 2018, ICML.
[16] Elliot Meyerson, et al. Beyond Shared Hierarchies: Deep Multitask Learning through Soft Layer Ordering, 2017, ICLR.
[17] Gaël Varoquaux, et al. Scikit-learn: Machine Learning in Python, 2011, J. Mach. Learn. Res.
[18] C. A. Nelson, et al. Learning to Learn, 2017, Encyclopedia of Machine Learning and Data Mining.
[19] Geoffrey E. Hinton, et al. Visualizing Data using t-SNE, 2008.
[20] Elliot Meyerson, et al. Modular Universal Reparameterization: Deep Multi-task Learning Across Diverse Domains, 2019, NeurIPS.
[21] M. M. Hassan Mahmud, et al. On universal transfer learning, 2007, Theor. Comput. Sci.
[22] Jimmy Ba, et al. Adam: A Method for Stochastic Optimization, 2014, ICLR.
[23] Samuel L. Smith, et al. Batch Normalization Biases Deep Residual Networks Towards Shallow Paths, 2020, ArXiv.
[24] Yee Whye Teh, et al. Distral: Robust multitask reinforcement learning, 2017, NIPS.
[25] Rama Chellappa, et al. HyperFace: A Deep Multi-Task Learning Framework for Face Detection, Landmark Localization, Pose Estimation, and Gender Recognition, 2019, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[26] Yongxin Yang, et al. A Unified Perspective on Multi-Domain and Multi-Task Learning, 2014, ICLR.
[27] Karl Pearson. LIII. On lines and planes of closest fit to systems of points in space, 1901.
[28] David A. McAllester, et al. A PAC-Bayesian Approach to Spectrally-Normalized Margin Bounds for Neural Networks, 2017, ICLR.
[29] Yoshimasa Tsuruoka, et al. A Joint Many-Task Model: Growing a Neural Network for Multiple NLP Tasks, 2016, EMNLP.
[30] Luca Antiga, et al. Automatic differentiation in PyTorch, 2017.
[31] Martial Hebert, et al. Cross-Stitch Networks for Multi-task Learning, 2016, IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
[32] Sepp Hochreiter, et al. Self-Normalizing Neural Networks, 2017, NIPS.
[33] Massimiliano Pontil, et al. Convex multi-task feature learning, 2008, Machine Learning.
[34] Andrea Vedaldi, et al. Learning multiple visual domains with residual adapters, 2017, NIPS.
[35] Aaron C. Courville, et al. FiLM: Visual Reasoning with a General Conditioning Layer, 2017, AAAI.
[36] Lukasz Kaiser, et al. One Model To Learn Them All, 2017, ArXiv.
[37] Peter L. Bartlett, et al. Rademacher and Gaussian Complexities: Risk Bounds and Structural Results, 2003, J. Mach. Learn. Res.
[38] Tom Schaul, et al. Reinforcement Learning with Unsupervised Auxiliary Tasks, 2016, ICLR.
[39] B. Frey, et al. Predicting the sequence specificities of DNA- and RNA-binding proteins by deep learning, 2015, Nature Biotechnology.
[40] Alex Krizhevsky, et al. Learning Multiple Layers of Features from Tiny Images, 2009.
[41] M. M. Hassan Mahmud, et al. Transfer Learning using Kolmogorov Complexity: Basic Theory and Empirical Evaluations, 2007, NIPS.
[42] Dean Collins, et al. Updating Australia's high-quality annual temperature dataset, 2003.
[43] Jason Weston, et al. A unified architecture for natural language processing: deep neural networks with multitask learning, 2008, ICML '08.
[44] Joseph J. Lim, et al. Multimodal Model-Agnostic Meta-Learning via Task-Aware Modulation, 2019, NeurIPS.
[45] George Cybenko, et al. Approximation by superpositions of a sigmoidal function, 1989, Math. Control. Signals Syst.
[46] Sergey Levine, et al. Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks, 2017, ICML.
[47] Wojciech Czarnecki, et al. On Loss Functions for Deep Neural Networks in Classification, 2017, ArXiv.
[48] Yifan Gong, et al. Cross-language knowledge transfer using multilingual deep neural network with shared hidden layers, 2013, IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).
[49] Teuvo Kohonen, et al. The self-organizing map, 1990, Neurocomputing.
[50] Kristen Grauman, et al. Learning with Whom to Share in Multi-task Feature Learning, 2011, ICML.
[51] Michael L. Littman, et al. State Abstractions for Lifelong Reinforcement Learning, 2018, ICML.