Evaluating Hopfield-network-based linear solvers for hardware-constrained neural substrates

Emerging neural hardware substrates, such as IBM's “TrueNorth” neurosynaptic system, provide an appealing platform for deploying numerical algorithms. For example, a recurrent Hopfield neural network can compute the Moore-Penrose generalized inverse of a matrix, which enables a broad class of linear optimizations to be solved efficiently and at very low energy cost. However, deploying numerical algorithms on hardware platforms that severely limit the range and precision of numeric representations is challenging. This paper discusses these challenges and experimentally validates that generalized inverses can be computed correctly on such substrates. Specifically, we show that several real-time applications requiring matrix inverse computations to solve systems of linear equations can be implemented correctly on hardware-constrained neural substrates. The Hopfield linear solver model is empirically validated on the IBM TrueNorth platform, and the results show promising potential for an accurate and energy-efficient generalized matrix inverse engine, with compelling real-time applications including target tracking (object localization), optical flow, and inverse kinematics.
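As a minimal sketch of the idea behind a linear Hopfield solver (not the paper's TrueNorth implementation), the recurrent dynamics can be viewed as gradient descent on the least-squares energy ||Ax − b||². Starting from x = 0, the iteration converges to the minimum-norm least-squares solution, i.e. the Moore-Penrose solution pinv(A) @ b. The function name, step-size rule, and iteration count below are illustrative assumptions:

```python
import numpy as np

def hopfield_lstsq(A, b, eta=None, iters=5000):
    """Hopfield-style gradient dynamics for min ||Ax - b||^2.

    With x0 = 0 and step size eta < 2 / sigma_max(A)^2, the iteration
    x <- x + eta * A^T (b - A x) converges to the minimum-norm
    least-squares solution, i.e. np.linalg.pinv(A) @ b.
    """
    _, n = A.shape
    if eta is None:
        # The largest singular value bounds the stable step size.
        eta = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(n)
    for _ in range(iters):
        x = x + eta * A.T @ (b - A @ x)  # recurrent update
    return x

# Small overdetermined system as a check against the pseudoinverse.
A = np.array([[1.0, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])
b = np.array([1.0, 2.0, 3.0])
x = hopfield_lstsq(A, b)
print(np.allclose(x, np.linalg.pinv(A) @ b))  # True
```

On a substrate like TrueNorth, the matrix-vector products in this loop would be realized by the synaptic crossbar, which is where the limited range and precision of weights become the central difficulty discussed in the abstract.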
