Learn to Grow: A Continual Structure Learning Framework for Overcoming Catastrophic Forgetting
Xilai Li | Yingbo Zhou | Tianfu Wu | Richard Socher | Caiming Xiong
[1] Wei Chen, et al. SupportNet: solving catastrophic forgetting in class incremental learning with support data, 2018, ArXiv.
[2] Christoph H. Lampert, et al. iCaRL: Incremental Classifier and Representation Learning, 2016, 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
[3] Jian Sun, et al. Deep Residual Learning for Image Recognition, 2015, 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
[4] Chrisantha Fernando, et al. PathNet: Evolution Channels Gradient Descent in Super Neural Networks, 2017, ArXiv.
[5] Risto Miikkulainen, et al. Evolving Neural Networks through Augmenting Topologies, 2002, Evolutionary Computation.
[6] Razvan Pascanu, et al. Overcoming catastrophic forgetting in neural networks, 2016, Proceedings of the National Academy of Sciences.
[7] Jiwon Kim, et al. Continual Learning with Deep Generative Replay, 2017, NIPS.
[8] Ramesh Raskar, et al. Accelerating Neural Architecture Search using Performance Prediction, 2017, ICLR.
[9] Sebastian Thrun, et al. Lifelong robot learning, 1993, Robotics Auton. Syst.
[10] Michael McCloskey, et al. Catastrophic Interference in Connectionist Networks: The Sequential Learning Problem, 1989.
[11] Yiming Yang, et al. DARTS: Differentiable Architecture Search, 2018, ICLR.
[12] Alexandros Karatzoglou, et al. Overcoming Catastrophic Forgetting with Hard Attention to the Task, 2018, ICML.
[13] Surya Ganguli, et al. Continual Learning Through Synaptic Intelligence, 2017, ICML.
[14] Yee Whye Teh, et al. Progress & Compress: A scalable framework for continual learning, 2018, ICML.
[15] Svetlana Lazebnik, et al. Piggyback: Adapting a Single Network to Multiple Tasks by Learning to Mask Weights, 2018, ECCV.
[16] Sung Ju Hwang, et al. Lifelong Learning with Dynamically Expandable Networks, 2017, ICLR.
[17] Andrea Vedaldi, et al. Efficient Parametrization of Multi-domain Deep Neural Networks, 2018, 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition.
[18] Geoffrey E. Hinton, et al. ImageNet classification with deep convolutional neural networks, 2012, Commun. ACM.
[19] Richard E. Turner, et al. Variational Continual Learning, 2017, ICLR.
[20] Anthony V. Robins, et al. Catastrophic Forgetting, Rehearsal and Pseudorehearsal, 1995, Connect. Sci.
[21] Barbara Caputo, et al. Adding New Tasks to a Single Network with Weight Transformations using Binary Masks, 2018, ECCV Workshops.
[22] R. Ratcliff, et al. Connectionist models of recognition memory: constraints imposed by learning and forgetting functions, 1990, Psychological Review.
[23] Yann LeCun, et al. The MNIST database of handwritten digits, 2005.
[24] Silvio Savarese, et al. Active Learning for Convolutional Neural Networks: A Core-Set Approach, 2017, ICLR.
[25] Yarin Gal, et al. Towards Robust Evaluations of Continual Learning, 2018, ArXiv.
[26] Andrea Vedaldi, et al. Learning multiple visual domains with residual adapters, 2017, NIPS.
[27] Quoc V. Le, et al. Neural Architecture Search with Reinforcement Learning, 2016, ICLR.
[28] Marc'Aurelio Ranzato, et al. Gradient Episodic Memory for Continual Learning, 2017, NIPS.
[29] Byoung-Tak Zhang, et al. Overcoming Catastrophic Forgetting by Incremental Moment Matching, 2017, NIPS.