David Filliat | Alexander Gepperth | Andrei Stoian | Timothée Lesort
[1] Christoph H. Lampert, et al. iCaRL: Incremental Classifier and Representation Learning, 2016, 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
[2] Bogdan Raducanu, et al. Memory Replay GANs: learning to generate images from new categories without forgetting, 2018, NeurIPS.
[3] Alexander Gepperth, et al. A Bio-Inspired Incremental Learning Architecture for Applied Perceptual Problems, 2016, Cognitive Computation.
[4] Svetlana Lazebnik, et al. PackNet: Adding Multiple Tasks to a Single Network by Iterative Pruning, 2017, 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition.
[5] Chrisantha Fernando, et al. PathNet: Evolution Channels Gradient Descent in Super Neural Networks, 2017, ArXiv.
[6] Joost van de Weijer, et al. Rotate your Networks: Better Weight Consolidation and Less Catastrophic Forgetting, 2018, 2018 24th International Conference on Pattern Recognition (ICPR).
[7] Yandong Guo, et al. Incremental Classifier Learning with Generative Adversarial Networks, 2018, ArXiv.
[8] Alexander Gepperth, et al. Simplified Computation and Interpretation of Fisher Matrices in Incremental Learning with Deep Neural Networks, 2019, ICANN.
[9] David Filliat, et al. Training Discriminative Models to Evaluate Generative Ones, 2019, ICANN.
[10] Jiwon Kim, et al. Continual Learning with Deep Generative Replay, 2017, NIPS.
[11] Jürgen Schmidhuber, et al. Compete to Compute, 2013, NIPS.
[12] Michael I. Jordan, et al. Advances in Neural Information Processing Systems 30, 1995.
[13] Marcus Rohrbach, et al. Memory Aware Synapses: Learning what (not) to forget, 2017, ECCV.
[14] Tinne Tuytelaars, et al. Expert Gate: Lifelong Learning with a Network of Experts, 2016, 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
[15] Roland Vollgraf, et al. Fashion-MNIST: a Novel Image Dataset for Benchmarking Machine Learning Algorithms, 2017, ArXiv.
[16] Stefan Wermter, et al. Continual Lifelong Learning with Neural Networks: A Review, 2019, Neural Networks.
[17] Faisal Shafait, et al. Distillation Techniques for Pseudo-rehearsal Based Incremental Learning, 2018, ArXiv.
[18] Alexandros Karatzoglou, et al. Overcoming Catastrophic Forgetting with Hard Attention to the Task, 2018.
[19] Benedikt Pfülb, et al. A comprehensive, application-oriented study of catastrophic forgetting in DNNs, 2019, ICLR.
[20] Simon Osindero, et al. Conditional Generative Adversarial Nets, 2014, ArXiv.
[21] Derek Hoiem, et al. Learning without Forgetting, 2016, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[22] Razvan Pascanu, et al. Progressive Neural Networks, 2016, ArXiv.
[23] R. French. Catastrophic forgetting in connectionist networks, 1999, Trends in Cognitive Sciences.
[24] Barbara Hammer, et al. Incremental learning algorithms and applications, 2016, ESANN.
[25] Yoshua Bengio, et al. An Empirical Investigation of Catastrophic Forgetting in Gradient-Based Neural Networks, 2013, ICLR.
[26] David Filliat, et al. Generative Models from the perspective of Continual Learning, 2018, 2019 International Joint Conference on Neural Networks (IJCNN).
[27] Hyo-Eun Kim, et al. Keep and Learn: Continual Learning by Constraining the Latent Space for Knowledge Preservation in Neural Networks, 2018, MICCAI.
[28] Ronald Kemker, et al. FearNet: Brain-Inspired Model for Incremental Learning, 2017, ICLR.
[29] Yan Liu, et al. Deep Generative Dual Memory Network for Continual Learning, 2017, ArXiv.
[30] Sung Ju Hwang, et al. Lifelong Learning with Dynamically Expandable Networks, 2017, ICLR.
[31] Benedikt Pfülb, et al. Catastrophic Forgetting: Still a Problem for DNNs, 2018, ICANN.
[32] L. Vinet, et al. A ‘missing’ family of classical orthogonal polynomials, 2010, 1011.1669.
[33] Honglak Lee, et al. Learning Structured Output Representation using Deep Conditional Generative Models, 2015, NIPS.
[34] Nitish Srivastava, et al. Improving neural networks by preventing co-adaptation of feature detectors, 2012, ArXiv.
[35] Marcus Rohrbach, et al. Selfless Sequential Learning, 2018, ICLR.
[36] Hongzhi Wang, et al. Life-long learning based on dynamic combination model, 2017, Appl. Soft Comput.
[37] Byoung-Tak Zhang, et al. Overcoming Catastrophic Forgetting by Incremental Moment Matching, 2017, NIPS.
[38] Cordelia Schmid, et al. Incremental Learning of Object Detectors without Catastrophic Forgetting, 2017, 2017 IEEE International Conference on Computer Vision (ICCV).
[39] Yoshua Bengio, et al. Gradient-based learning applied to document recognition, 1998, Proc. IEEE.
[40] Yoshua Bengio, et al. Generative Adversarial Nets, 2014, NIPS.
[41] Razvan Pascanu, et al. Overcoming catastrophic forgetting in neural networks, 2016, Proceedings of the National Academy of Sciences.
[42] Philip H. S. Torr, et al. Riemannian Walk for Incremental Learning: Understanding Forgetting and Intransigence, 2018, ECCV.