Christian Wachinger | Anne-Marie Rickmann | Abhijit Guha Roy | Sinan Özgür Özgün