Incremental Learning with Unlabeled Data in the Wild
Kibok Lee | Jinwoo Shin | Honglak Lee | Kimin Lee
[1] Cordelia Schmid,et al. End-to-End Incremental Learning , 2018, ECCV.
[2] Junmo Kim,et al. Less-forgetful Learning for Domain Expansion in Deep Neural Networks , 2017, AAAI.
[3] Kibok Lee,et al. Training Confidence-calibrated Classifiers for Detecting Out-of-Distribution Samples , 2017, ICLR.
[4] Fei-Fei Li,et al. ImageNet: A large-scale hierarchical image database , 2009, 2009 IEEE Conference on Computer Vision and Pattern Recognition.
[5] R. French. Catastrophic Forgetting in Connectionist Networks , 2006 .
[6] Alexandros Karatzoglou,et al. Overcoming Catastrophic Forgetting with Hard Attention to the Task , 2018 .
[7] Demis Hassabis,et al. Mastering the game of Go without human knowledge , 2017, Nature.
[8] Philip H. S. Torr,et al. Riemannian Walk for Incremental Learning: Understanding Forgetting and Intransigence , 2018, ECCV.
[9] Yandong Guo,et al. Incremental Classifier Learning with Generative Adversarial Networks , 2018, ArXiv.
[10] Frank Hutter,et al. A Downsampled Variant of ImageNet as an Alternative to the CIFAR datasets , 2017, ArXiv.
[11] Kaiming He,et al. Exploring the Limits of Weakly Supervised Pretraining , 2018, ECCV.
[12] Razvan Pascanu,et al. Progressive Neural Networks , 2016, ArXiv.
[13] Antonio Torralba,et al. 80 Million Tiny Images: A Large Dataset for Non-parametric Object and Scene Recognition , 2008, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[14] Yuichi Yoshida,et al. Spectral Normalization for Generative Adversarial Networks , 2018, ICLR.
[16] Andreas S. Tolias,et al. Generative replay with feedback connections as a general strategy for continual learning , 2018, ArXiv.
[17] Marcus Rohrbach,et al. Memory Aware Synapses: Learning what (not) to forget , 2017, ECCV.
[18] Jonathan Krause,et al. The Unreasonable Effectiveness of Noisy Data for Fine-Grained Recognition , 2015, ECCV.
[19] Yee Whye Teh,et al. Progress & Compress: A scalable framework for continual learning , 2018, ICML.
[20] Christoph H. Lampert,et al. iCaRL: Incremental Classifier and Representation Learning , 2016, 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
[21] Derek Hoiem,et al. Learning without Forgetting , 2016, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[22] Richard E. Turner,et al. Variational Continual Learning , 2017, ICLR.
[23] David Filliat,et al. Generative Models from the perspective of Continual Learning , 2018, 2019 International Joint Conference on Neural Networks (IJCNN).
[24] Byoung-Tak Zhang,et al. Overcoming Catastrophic Forgetting by Incremental Moment Matching , 2017, NIPS.
[25] Wei Chen,et al. SupportNet: solving catastrophic forgetting in class incremental learning with support data , 2018, ArXiv.
[26] Jian Sun,et al. Deep Residual Learning for Image Recognition , 2015, 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
[27] Razvan Pascanu,et al. Overcoming catastrophic forgetting in neural networks , 2016, Proceedings of the National Academy of Sciences.
[28] Michael McCloskey,et al. Catastrophic Interference in Connectionist Networks: The Sequential Learning Problem , 1989 .
[29] Surya Ganguli,et al. Continual Learning Through Synaptic Intelligence , 2017, ICML.
[30] Dahua Lin,et al. Lifelong Learning via Progressive Distillation and Retrospection , 2018, ECCV.
[31] Nikos Komodakis,et al. Wide Residual Networks , 2016, BMVC.
[32] David Barber,et al. Online Structured Laplace Approximations For Overcoming Catastrophic Forgetting , 2018, NeurIPS.
[33] Kilian Q. Weinberger,et al. On Calibration of Modern Neural Networks , 2017, ICML.
[34] Kibok Lee,et al. Hierarchical Novelty Detection for Visual Object Recognition , 2018, 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition.
[35] Marc'Aurelio Ranzato,et al. Gradient Episodic Memory for Continual Learning , 2017, NIPS.
[38] Jiwon Kim,et al. Continual Learning with Deep Generative Replay , 2017, NIPS.
[39] Rajat Raina,et al. Self-taught learning: transfer learning from unlabeled data , 2007, ICML '07.
[40] Kaiming He,et al. Mask R-CNN , 2017, 2017 IEEE International Conference on Computer Vision (ICCV).
[41] Geoffrey E. Hinton,et al. Distilling the Knowledge in a Neural Network , 2015, ArXiv.
[42] Hyo-Eun Kim,et al. Keep and Learn: Continual Learning by Constraining the Latent Space for Knowledge Preservation in Neural Networks , 2018, MICCAI.
[43] Alex Krizhevsky,et al. Learning Multiple Layers of Features from Tiny Images , 2009 .
[44] Svetlana Lazebnik,et al. Piggyback: Adapting a Single Network to Multiple Tasks by Learning to Mask Weights , 2018, ECCV.