Differential Hebbian learning

The differential Hebbian law $\dot{e}_{ij} = \dot{C}_i \dot{C}_j$ is examined as an alternative to the traditional Hebbian law $\dot{e}_{ij} = C_i C_j$ for updating edge connection strengths in neural networks. The motivation is that concurrent change, rather than just concurrent activation, more accurately captures the "concomitant variation" that is central to inductively inferred functional relationships. The resulting networks are characterized by a kinetic, rather than potential, energy. Yet we prove that both system energies are given by the same entropy-like functional of connection matrices, $\operatorname{Trace}(\dot{E} E)$. We prove that the differential Hebbian is equivalent to stochastic-process correlation (a cross-covariance kernel). We exactly solve the differential Hebbian law, interpret the sequence of edges as a stochastic process, and report that the edge process is a submartingale: the edges are expected to increase with time. The submartingale edges decompose into a martingale or unchanging process and an increasing or novelty process. Hence conditioned averages of edge residuals are encoded in learning even though the network only "experiences" the unconditioned edge residuals.
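As a rough illustrative sketch (not from the paper), the two laws can be compared in discrete time by replacing the signal velocities $\dot{C}_i$ with finite differences. The function names, the learning rate eta, and the toy signal below are assumptions introduced only for illustration.

```python
import numpy as np

def hebbian_step(E, C, eta=0.01):
    """Classical Hebbian update: a discrete-time version of
    de_ij/dt = C_i * C_j (concurrent activation)."""
    return E + eta * np.outer(C, C)

def differential_hebbian_step(E, C_prev, C_curr, eta=0.01, dt=1.0):
    """Differential Hebbian update: a discrete-time version of
    de_ij/dt = dC_i/dt * dC_j/dt (concurrent change), using a
    finite-difference estimate of the signal velocities."""
    C_dot = (C_curr - C_prev) / dt
    return E + eta * np.outer(C_dot, C_dot)

# Toy usage: two units driven by the same slow oscillation plus noise,
# so their activations (and their changes) tend to vary together.
rng = np.random.default_rng(0)
E_hebb = np.zeros((2, 2))
E_diff = np.zeros((2, 2))
C_prev = np.sin(0.0) + 0.1 * rng.standard_normal(2)
for t in range(1, 200):
    C_curr = np.sin(0.1 * t) + 0.1 * rng.standard_normal(2)
    E_hebb = hebbian_step(E_hebb, C_curr)
    E_diff = differential_hebbian_step(E_diff, C_prev, C_curr)
    C_prev = C_curr

print("Hebbian edges:\n", E_hebb)
print("Differential Hebbian edges:\n", E_diff)
```

In this sketch the Hebbian edges grow whenever the units are simultaneously active, while the differential Hebbian edges grow only when the units change together, which is the "concomitant variation" the abstract emphasizes.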
