Seyed Iman Mirzadeh | Mehrdad Farajtabar | Razvan Pascanu | Hassan Ghasemzadeh | Dilan Gorur