Dynamic compression and expansion in a classifying recurrent network
Eric Shea-Brown | Matthew Farrell | Stefano Recanatesi | Guillaume Lajoie
[1] Adrienne L. Fairhall,et al. Dimensionality reduction in neuroscience , 2016, Current Biology.
[2] Robert A. Lordo,et al. Learning from Data: Concepts, Theory, and Methods , 2001, Technometrics.
[3] Stefano Fusi,et al. Why neurons mix: high dimensionality for higher cognition , 2016, Current Opinion in Neurobiology.
[4] Kenneth D. Harris,et al. High-dimensional geometry of population responses in visual cortex , 2019, Nature.
[5] Herbert Jaeger,et al. The "echo state" approach to analysing and training recurrent neural networks , 2001 .
[6] Zachary Chase Lipton. A Critical Review of Recurrent Neural Networks for Sequence Learning , 2015, ArXiv.
[7] Omri Barak,et al. Recurrent neural networks as versatile tools of neuroscience research , 2017, Current Opinion in Neurobiology.
[8] Thomas M. Cover,et al. Geometrical and Statistical Properties of Systems of Linear Inequalities with Applications in Pattern Recognition , 1965, IEEE Trans. Electron. Comput..
[9] Wei Ji Ma,et al. A diverse range of factors affect the nature of neural representations underlying short-term memory , 2018, Nature Neuroscience.
[10] H. Sompolinsky,et al. Coherent chaos in a recurrent neural network with structured connectivity , 2018, bioRxiv.
[11] Francesca Mastrogiuseppe,et al. Linking Connectivity, Dynamics, and Computations in Low-Rank Recurrent Neural Networks , 2017, Neuron.
[12] M. A. Smith,et al. The spatial structure of correlated neuronal variability , 2016, Nature Neuroscience.
[13] Claudia Clopath,et al. Sparse synaptic connectivity is required for decorrelation and pattern separation in feedforward networks , 2017, Nature Communications.
[14] Geoffrey E. Hinton,et al. Reducing the Dimensionality of Data with Neural Networks , 2006, Science.
[15] Eric Shea-Brown,et al. Dimensionality compression and expansion in Deep Neural Networks , 2019, ArXiv.
[16] R. J. Bell,et al. Atomic vibrations in vitreous silica , 1970 .
[17] Yoshua Bengio,et al. Gradient-based learning applied to document recognition , 1998, Proc. IEEE.
[18] J. Albus. A Theory of Cerebellar Function , 1971 .
[19] Razvan Pascanu,et al. Vector-based navigation using grid-like representations in artificial agents , 2018, Nature.
[20] Stefano Fusi,et al. Low-dimensional dynamics for working memory and time encoding , 2020, Proceedings of the National Academy of Sciences.
[21] Dean V. Buonomano,et al. Robust timing and motor patterns by taming chaos in recurrent neural networks , 2012, Nature Neuroscience.
[22] Rainer Engelken,et al. Dynamical models of cortical circuits , 2014, Current Opinion in Neurobiology.
[23] T. Poggio,et al. Hierarchical models of object recognition in cortex , 1999, Nature Neuroscience.
[24] Eric Shea-Brown,et al. Encoding in Balanced Networks: Revisiting Spike Patterns and Chaos in Stimulus-Driven Systems , 2016, PLoS Comput. Biol..
[25] Michael N. Shadlen,et al. Low dimensional dynamics for working memory and time encoding , 2019 .
[26] C. Stam,et al. Nonlinear dynamical analysis of EEG and MEG: Review of an emerging field , 2005, Clinical Neurophysiology.
[27] Geoffrey E. Hinton,et al. Deep Learning , 2015, Nature.
[28] Naftali Tishby,et al. Opening the Black Box of Deep Neural Networks via Information , 2017, ArXiv.
[29] Yuanzhi Li,et al. Learning Overparameterized Neural Networks via Stochastic Gradient Descent on Structured Data , 2018, NeurIPS.
[30] L. F. Abbott,et al. Generating Coherent Patterns of Activity from Chaotic Neural Networks , 2009, Neuron.
[31] Anthony Widjaja,et al. Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond , 2003, IEEE Transactions on Neural Networks.
[32] F. Wolf,et al. Dynamical entropy production in spiking neuron networks in the balanced state. , 2010, Physical review letters.
[33] L. Abbott,et al. Stimulus-dependent suppression of chaos in recurrent neural networks. , 2009, Physical review. E, Statistical, nonlinear, and soft matter physics.
[34] D. Ruelle,et al. Ergodic theory of chaos and strange attractors , 1985 .
[35] Andrew M. Saxe,et al. High-dimensional dynamics of generalization error in neural networks , 2017, Neural Networks.
[36] P. Grassberger,et al. Measuring the Strangeness of Strange Attractors , 1983 .
[37] Eric Shea-Brown,et al. Chaos and reliability in balanced spiking networks with temporal drive. , 2012, Physical review. E, Statistical, nonlinear, and soft matter physics.
[38] G. La Camera,et al. Dynamics of Multistable States during Ongoing and Evoked Cortical Activity , 2015, The Journal of Neuroscience.
[39] Rainer Engelken,et al. Dimensionality and entropy of spontaneous and evoked rate activity , 2017 .
[40] Alessandro Laio,et al. Intrinsic dimension of data representations in deep neural networks , 2019, NeurIPS.
[41] Pascal Vincent,et al. Representation Learning: A Review and New Perspectives , 2012, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[42] H. Sommers,et al. Chaos in random neural networks. , 1988, Physical review letters.
[43] Haim Sompolinsky,et al. Inferring Stimulus Selectivity from the Spatial Structure of Neural Network Dynamics , 2010, NIPS.
[44] Daniel D. Lee,et al. Classification and Geometry of General Perceptual Manifolds , 2017, Physical Review X.
[45] Alexander J. Smola,et al. Learning with Kernels: support vector machines, regularization, optimization, and beyond , 2001, Adaptive computation and machine learning series.
[46] Haim Sompolinsky,et al. Coherent chaos in a recurrent neural network with structured connectivity , 2018, PLoS Comput. Biol..
[47] Haim Sompolinsky,et al. Separability and geometry of object manifolds in deep neural networks , 2019, Nature Communications.
[48] Henry Markram,et al. Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations , 2002, Neural Computation.
[49] Jonathan Kadmon,et al. Optimal Architectures in a Solvable Model of Deep Networks , 2016, NIPS.
[50] H. Sompolinsky,et al. Sparseness and Expansion in Sensory Representations , 2014, Neuron.
[51] Surya Ganguli,et al. A theory of multineuronal dimensionality, dynamics and measurement , 2017, bioRxiv.
[52] Robert A. Legenstein,et al. Edge of chaos and prediction of computational performance for neural circuit models , 2007, Neural Networks.
[53] Haim Sompolinsky,et al. Chaotic Balanced State in a Model of Cortical Circuits , 1998, Neural Computation.
[54] Haiping Huang,et al. Mechanisms of dimensionality reduction and decorrelation in deep neural networks , 2018, Physical Review E.
[55] M. London,et al. Sensitivity to perturbations in vivo implies high noise and suggests rate coding in cortex , 2010, Nature.
[56] Devika Narain,et al. Flexible timing by temporal scaling of cortical responses , 2017, Nature Neuroscience.
[57] Xiao-Jing Wang,et al. The importance of mixed selectivity in complex cognitive tasks , 2013, Nature.
[58] Jorge Nocedal,et al. On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima , 2016, ICLR.
[59] W. Newsome,et al. Context-dependent computation by recurrent dynamics in prefrontal cortex , 2013, Nature.
[60] Vladimir Cherkassky,et al. Learning from Data: Concepts, Theory, and Methods , 1998 .
[61] Haim Sompolinsky,et al. Optimal Degrees of Synaptic Connectivity , 2017, Neuron.
[62] Haim Sompolinsky,et al. Linear readout of object manifolds. , 2015, Physical review. E.
[63] D. Marr. A theory of cerebellar cortex , 1969, The Journal of physiology.
[64] Vladimir N. Vapnik,et al. The Nature of Statistical Learning Theory , 2000, Statistics for Engineering and Information Science.
[65] Vladimir Vapnik,et al. Statistical learning theory , 1998 .
[66] Alex Graves,et al. Supervised Sequence Labelling with Recurrent Neural Networks , 2012, Studies in Computational Intelligence.
[67] Ulrich Parlitz,et al. Comparison of Different Methods for Computing Lyapunov Exponents , 1990 .
[68] A. Litwin-Kumar,et al. Slow dynamics and high variability in balanced cortical networks with clustered connections , 2012, Nature Neuroscience.
[69] Byron M. Yu,et al. Dimensionality reduction for large-scale neural recordings , 2014, Nature Neuroscience.
[70] G. La Camera,et al. Stimuli Reduce the Dimensionality of Cortical Activity , 2015, bioRxiv.
[71] L. F. Abbott,et al. full-FORCE: A target-based method for training recurrent networks , 2017, PloS one.
[72] Brent Doiron,et al. Circuit Models of Low-Dimensional Shared Variability in Cortical Networks , 2019, Neuron.