On the Inverse Pattern Recognition Problem in the Context of the Time-Series Data Processing with Memristor Networks

The implementation problem deals with identifying the computations that a given physical system can perform. This issue is closely related to the problem of describing the computing capacity of the system. In this chapter, these issues have been addressed in the context of on-line (real-time) pattern recognition of time-series data, where memristor networks are used in a reservoir computing setup to perform information processing. Instead of designing a network to solve a particular task, the inverse question has been addressed: given a network of a certain design, which signals might it be particularly adept at recognizing? Several key theoretical concepts have been identified and formalized. This enabled us to approach the problem in a rigorous mathematical way: the problem has been formulated as an optimization problem, and a suitable algorithm for solving it has been suggested. The algorithm has been implemented as computer software: for a given network description, the software produces the time-dependent voltage patterns (signals) that are best recognized by the network. These patterns are found by performing a directed random search in the space of input signals. As an illustration of how to use the algorithm, we systematically investigated all networks containing up to four memristors.
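The sketch below is a minimal, hypothetical illustration of the search idea described above, not the chapter's actual software. It assumes a toy linear-drift memristor model, a small parallel bank of three devices standing in for the "network", an ad-hoc separability score standing in for recognition quality, and arbitrary parameter values; the directed random search is realized as simple hill climbing with Gaussian perturbations of the input signal.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed device and integration constants (toy linear ionic-drift model).
R_ON, R_OFF, MU, DT = 100.0, 16e3, 1e4, 1e-3

def run_reservoir(signal, w0=(0.2, 0.5, 0.8)):
    """Drive a small parallel bank of memristors with a voltage signal.

    Returns the final internal device states, used here as the read-out.
    """
    w = np.array(w0, dtype=float)                  # per-device state in [0, 1]
    for v in signal:
        R = R_ON * w + R_OFF * (1.0 - w)           # per-device resistance
        i = v / R                                  # per-device current (parallel drive)
        w = np.clip(w + MU * i * DT, 0.0, 1.0)     # linear-drift state update
    return w

def separability(signal, distractors):
    """Assumed stand-in for 'how well the network recognizes this signal':
    mean distance between the response to the signal and the responses to a
    fixed set of random distractor signals."""
    target = run_reservoir(signal)
    others = np.array([run_reservoir(d) for d in distractors])
    return np.linalg.norm(others - target, axis=1).mean()

def directed_random_search(n_samples=200, amp=1.0, iters=500, sigma=0.1):
    """Hill-climb in the space of input signals: keep a Gaussian perturbation
    of the current best signal only if it improves the separability score."""
    distractors = rng.uniform(-amp, amp, size=(20, n_samples))
    best = rng.uniform(-amp, amp, n_samples)
    best_score = separability(best, distractors)
    for _ in range(iters):
        cand = np.clip(best + rng.normal(0.0, sigma, n_samples), -amp, amp)
        s = separability(cand, distractors)
        if s > best_score:
            best, best_score = cand, s
    return best, best_score

if __name__ == "__main__":
    signal, score = directed_random_search()
    print(f"best separability score: {score:.4f}")
```

In this toy setting the "network description" is fixed (the device model and initial states), and the search only varies the input signal, mirroring the inverse question posed above; the chapter's software operates on arbitrary user-specified memristor networks rather than this hard-coded bank.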
