Photonic online learning: a perspective

Abstract

Emerging neuromorphic hardware promises to solve certain problems faster and with higher energy efficiency than traditional computing by using physical processes that take place at the device level as the computational primitives in neural networks. While initial results in photonic neuromorphic hardware are very promising, such hardware requires programming or “training” that is often power-hungry and time-consuming. In this article, we examine the online learning paradigm, where the machinery for training is built deeply into the hardware itself. We argue that some form of online learning will be necessary if photonic neuromorphic hardware is to achieve its true potential.
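To make the contrast with offline training concrete, the following is a minimal sketch of one family of in-situ, model-free schemes: simultaneous perturbation stochastic approximation (SPSA), in which the hardware's own measured loss drives every weight update. It is illustrative only, not the specific method advocated in this article; the function forward_hardware and all variable names are hypothetical stand-ins for measurements that a real photonic system would supply.

```python
# Illustrative sketch only: online (in-situ) training of a simulated analog
# "hardware" layer via simultaneous perturbation stochastic approximation.
# forward_hardware is a hypothetical stand-in for a physical forward pass.
import numpy as np

rng = np.random.default_rng(0)

def forward_hardware(W, x):
    """Stand-in for the physical system (e.g., light through a weight bank).
    The trainer never inspects its internals, only its measured outputs."""
    return np.tanh(W @ x)

def loss(W, x, target):
    """Scalar error measured at the hardware output."""
    return np.mean((forward_hardware(W, x) - target) ** 2)

def spsa_step(W, x, target, lr=0.05, eps=1e-3):
    """One online update: perturb all weights at once, measure the loss twice
    on the hardware itself, and estimate the gradient from the difference."""
    delta = rng.choice([-1.0, 1.0], size=W.shape)   # random +/-1 perturbation
    g_hat = (loss(W + eps * delta, x, target) -
             loss(W - eps * delta, x, target)) / (2 * eps) * delta
    return W - lr * g_hat                           # update weights in place

# Toy task: drive a fixed input toward a fixed target, one sample at a time.
W = rng.normal(scale=0.1, size=(2, 4))
x, target = rng.normal(size=4), np.array([0.5, -0.5])
for step in range(2000):
    W = spsa_step(W, x, target)
print("final loss:", loss(W, x, target))
```

The key point the sketch captures is that no external model of the hardware is needed: the update rule consumes only two loss measurements per step, which is why perturbative schemes of this kind are attractive candidates for building the training machinery directly into photonic devices.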
