Arthur Szlam | Maximilian Nickel | Ari S. Morcos | Ramakrishna Vedantam | Brenden M. Lake