Function and Dissipation in Finite State Automata - From Computing to Intelligence and Back
[1] W. Porod, et al. Energy Limits in Computation: A Review of Landauer's Principle, Theory and Experiments, 2019.
[2] Geoffrey E. Hinton, et al. The Helmholtz Machine, 1995, Neural Computation.
[3] Charles H. Bennett, et al. Notes on Landauer's Principle, Reversible Computation, and Maxwell's Demon, 2002, arXiv:physics/0210005.
[4] Nicholas Pippenger. Reliable Computation in the Presence of Noise, 1986.
[5] Karoline Wiesner, et al. Information-theoretic lower bound on energy cost of stochastic computation, 2011, Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences.
[6] N. G. Anderson, et al. Toward Nanoprocessor Thermodynamics, 2012, IEEE Transactions on Nanotechnology.
[7] Wei Yang Lu, et al. Nanoscale memristor device as synapse in neuromorphic systems, 2010, Nano Letters.
[8] Gary Marcus, et al. Deep Learning: A Critical Appraisal, 2018, arXiv.
[9] Saurabh K. Bose, et al. Stable Self-Assembled Atomic-Switch Networks for Neuromorphic Applications, 2017, IEEE Transactions on Electron Devices.
[10] Neal G. Anderson. Irreversible information loss: Fundamental notions and entropy costs, 2014.
[11] Robert Marsland, et al. Statistical Physics of Adaptation, 2014, arXiv:1412.1875.
[12] John Langford, et al. Efficient Exploration in Reinforcement Learning, 2017, Encyclopedia of Machine Learning and Data Mining.
[13] Peng Lin, et al. Fully memristive neural networks for pattern classification with unsupervised learning, 2018.
[14] Neal G. Anderson, et al. Heat Dissipation in Nanocomputing: Lower Bounds From Physical Information Theory, 2013, IEEE Transactions on Nanotechnology.
[15] R. Keyes. Physical limits of silicon transistors and circuits, 2005.
[16] Robert L. Fry, et al. Physical Intelligence and Thermodynamic Computing, 2017, Entropy.
[17] Matthew D. Zeiler. ADADELTA: An Adaptive Learning Rate Method, 2012, arXiv.
[18] J. Anders, et al. Quantum thermodynamics, 2015, arXiv:1508.06099.
[19] Nuttapong Chentanez, et al. Intrinsically Motivated Reinforcement Learning, 2004, NIPS.
[20] B. Schumacher, et al. Limitation on the amount of accessible information in a quantum channel, 1996, Physical Review Letters.
[21] Suriyanarayanan Vaikuntanathan, et al. Design principles for nonequilibrium self-assembly, 2015, Proceedings of the National Academy of Sciences.
[22] Natesh Ganesh, et al. Dissipation in neuromorphic computing: Fundamental bounds for feedforward networks, 2017, 2017 IEEE 17th International Conference on Nanotechnology (IEEE-NANO).
[23] A. Holevo. Bounds for the quantity of information transmitted by a quantum communication channel, 1973.
[24] Susanne Still. Thermodynamic cost and benefit of data representations, 2017.
[25] Erik DeBenedictis, et al. A path toward ultra-low-energy computing, 2016, 2016 IEEE International Conference on Rebooting Computing (ICRC).
[26] Sandy Lovie. How the mind works, 1980, Nature.
[27] Eörs Szathmáry, et al. How Can Evolution Learn?, 2016, Trends in Ecology & Evolution.
[28] Susanne Still, et al. The thermodynamics of prediction, 2012, Physical Review Letters.
[29] Razvan Pascanu, et al. Overcoming catastrophic forgetting in neural networks, 2016, Proceedings of the National Academy of Sciences.
[30] Kaushik Roy, et al. Exploring Spin Transfer Torque Devices for Unconventional Computing, 2015, IEEE Journal on Emerging and Selected Topics in Circuits and Systems.
[31] Anders Krogh, et al. Introduction to the Theory of Neural Computation, 1994, The Advanced Book Program.
[32] Karl J. Friston, et al. A Free Energy Principle for Biological Systems, 2012, Entropy.
[33] Natesh Ganesh, et al. Thermodynamic Intelligence, a Heretical Theory, 2018, 2018 IEEE International Conference on Rebooting Computing (ICRC).
[34] R. Landauer, et al. Irreversibility and heat generation in the computing process, 1961, IBM Journal of Research and Development.
[35] John C. Baez, et al. Relative Entropy in Biological Systems, 2015, Entropy.
[36] Neal G. Anderson, et al. Information Erasure in Quantum Systems, 2008.
[37] Natesh Ganesh, et al. A Thermodynamic Treatment of Intelligent Systems, 2017, 2017 IEEE International Conference on Rebooting Computing (ICRC).
[38] Giacomo Indiveri, et al. Integration of nanoscale memristor synapses in neuromorphic computing architectures, 2013, Nanotechnology.
[39] Subutai Ahmad, et al. Properties of Sparse Distributed Representations and their Application to Hierarchical Temporal Memory, 2015, arXiv.
[40] Pedro M. Domingos. A few useful things to know about machine learning, 2012, Communications of the ACM.
[41] Christof Teuscher, et al. The Weird, the Small, and the Uncontrollable: Redefining the Frontiers of Computing, 2017, Computer.
[42] Chris Arney. Probably Approximately Correct: Nature's Algorithms for Learning and Prospering in a Complex World, 2014.
[43] Anil K. Jain, et al. Artificial Neural Networks: A Tutorial, 1996, Computer.
[44] Mark D. Hill, et al. Amdahl's Law in the Multicore Era, 2008, Computer.
[45] Shu-Kun Lin, et al. Modern Thermodynamics: From Heat Engines to Dissipative Structures, 1999, Entropy.
[46] Michael McCloskey, et al. Catastrophic Interference in Connectionist Networks: The Sequential Learning Problem, 1989.
[47] Tibor Nemetz. Information Theory and Communication, 2011.
[48] Guillem Collell, et al. Brain activity and cognition: a connection from thermodynamics and information theory, 2015, Frontiers in Psychology.
[49] James Ladyman, et al. What does it mean to say that a physical system implements a computation?, 2009, Theoretical Computer Science.
[50] Sterling Street. Neurobiology as Information Physics, 2016, bioRxiv.
[51] G. Crooks. Entropy production fluctuation theorem and the nonequilibrium work relation for free energy differences, 1999, Physical Review E.
[52] J. J. Hopfield. Neural networks and physical systems with emergent collective computational abilities, 1982, Proceedings of the National Academy of Sciences.
[53] Neal G. Anderson, et al. On the physical implementation of logical transformations: Generalized L-machines, 2010, Theoretical Computer Science.
[54] Adam Z. Stieg, et al. Neuromorphic Atomic Switch Networks, 2012, PLoS ONE.
[55] Neal G. Anderson, et al. Overwriting information: Correlations, physical costs, and environment models, 2012.
[56] Natesh Ganesh, et al. Irreversibility and dissipation in finite-state automata, 2013.
[57] Karl J. Friston, et al. Predictive coding under the free-energy principle, 2009, Philosophical Transactions of the Royal Society B: Biological Sciences.
[58] Jean-Éric Pin, et al. On Reversible Automata, 1992, LATIN.
[59] Jeremy L. England. Dissipative adaptation in driven self-assembly, 2015, Nature Nanotechnology.
[60] Susanne Still, et al. Information-theoretic approach to interactive learning, 2007, arXiv:0709.1948.
[61] James Ladyman, et al. The connection between logical and thermodynamic irreversibility, 2007.
[62] Thierry Paul, et al. Quantum computation and quantum information, 2007, Mathematical Structures in Computer Science.
[63] Naftali Tishby, et al. Document clustering using word clusters via the information bottleneck method, 2000, SIGIR '00.
[64] H. Noji, et al. A rotary molecular motor that can work at near 100% efficiency, 2000, Philosophical Transactions of the Royal Society of London, Series B: Biological Sciences.
[65] Frederick C. Hennie, et al. Finite-state Models for Logical Machines, 1968.
[66] Mihaela Ulieru, et al. Emergent engineering: a radical paradigm shift, 2011, International Journal of Autonomous and Adaptive Communications Systems.
[67] Amos Storkey, et al. Hopfield learning rule with high capacity storage of time-correlated patterns, 1997.
[68] Masakazu Aono, et al. Atomic switch networks as complex adaptive systems, 2018.
[69] Thomas M. Cover, et al. Elements of Information Theory, 2005.
[70] Naftali Tishby, et al. The information bottleneck method, 2000, arXiv.
[71] E. M. Statistical Mechanics, 2021, Manual for Theoretical Chemistry.
[72] Richard S. Sutton, et al. Reinforcement Learning: An Introduction, 1998, IEEE Transactions on Neural Networks.
[73] Geoffrey E. Hinton, et al. Autoencoders, Minimum Description Length and Helmholtz Free Energy, 1993, NIPS.
[74] Naftali Tishby, et al. Past-future information bottleneck in dynamical systems, 2009, Physical Review E.
[75] Madeline A. Lancaster, et al. Cerebral organoids model human brain development and microcephaly, 2013, Nature.
[76] Philippe Matherat, et al. Logical Dissipation of Automata Implementations - Dissipation of Computation, 1998.
[77] Perry L. Miller, et al. The Human Brain Project: neuroinformatics tools for integrating, searching and modeling multidisciplinary neuroscience data, 1998, Trends in Neurosciences.
[78] Jeremy L. England, et al. Statistical physics of self-replication, 2012, The Journal of Chemical Physics.
[79] David Wentzlaff, et al. The Accelerator Wall: Limits of Chip Specialization, 2019, 2019 IEEE International Symposium on High Performance Computer Architecture (HPCA).
[80] A. Wehrl. General properties of entropy, 1978.
[81] John R. Searle, et al. Minds, brains, and programs, 1980, Behavioral and Brain Sciences.
[82] Thomas N. Theis, et al. The End of Moore's Law: A New Beginning for Information Technology, 2017, Computing in Science & Engineering.
[83] Richard E. Blahut. Computation of channel capacity and rate-distortion functions, 1972, IEEE Transactions on Information Theory.
[84] Neal G. Anderson, et al. Conditioning, Correlation and Entropy Generation in Maxwell's Demon, 2013, Entropy.
[85] Rajesh P. N. Rao, et al. Predictive coding in the visual cortex: a functional interpretation of some extra-classical receptive-field effects, 1999, Nature Neuroscience.
[86] M. Lowe. Consciousness and Language, 2004.
[87] Yi Tao, et al. The replicator equation and other game dynamics, 2014, Proceedings of the National Academy of Sciences.
[88] G. E. Moore. Cramming More Components Onto Integrated Circuits, 1998, Proceedings of the IEEE.
[89] Volker J. Sorger, et al. Towards On-Chip Optical FFTs for Convolutional Neural Networks, 2017, 2017 IEEE International Conference on Rebooting Computing (ICRC).
[90] Susanne Still, et al. Information Bottleneck Approach to Predictive Inference, 2014, Entropy.
[91] Byoung-Tak Zhang, et al. Information-Theoretic Objective Functions for Lifelong Learning, 2013, AAAI Spring Symposium: Lifelong Machine Learning.
[92] Naftali Tishby, et al. Deep learning and the information bottleneck principle, 2015, 2015 IEEE Information Theory Workshop (ITW).
[93] O. Cueto, et al. Physical aspects of low power synapses based on phase change memory devices, 2012.
[94] Karl J. Friston. The free-energy principle: a rough guide to the brain?, 2009, Trends in Cognitive Sciences.
[95] R. H. Dennard, et al. Design of Ion-Implanted MOSFET's with Very Small Physical Dimensions, 1974, Proceedings of the IEEE.
[96] John E. Hershey, et al. Computation, 1991, Digital Signal Processing.
[97] S. Nelson, et al. Hebb and homeostasis in neuronal plasticity, 2000, Current Opinion in Neurobiology.
[98] W. Ashby, et al. Every Good Regulator of a System Must Be a Model of That System, 1970.
[99] A. M. Turing. Computing Machinery and Intelligence, 1950, The Philosophy of Artificial Intelligence.
[100] Hong Wang, et al. Loihi: A Neuromorphic Manycore Processor with On-Chip Learning, 2018, IEEE Micro.
[101] Marc Harper. The Replicator Equation as an Inference Dynamic, 2009, arXiv.
[102] Neal G. Anderson, et al. Information as a physical quantity, 2017, Information Sciences.
[103] C. E. Shannon. A Mathematical Theory of Communication, 1948, Bell System Technical Journal.