The problem of how a single binary higher-order unit can learn an arbitrary Boolean function of arbitrarily many variables is discussed. A solution is presented that exploits all correlations, of any order, between the multiple inputs and the single output. Each correlation coefficient is obtained by choosing a subset of the set of all inputs, multiplying the corresponding input values, and computing the correlation between this product and the output of the Boolean function. The order of this correlation is the number of elements in the chosen subset, and the number of coefficients to be evaluated equals the number of elements of the power set (the set of all subsets) of the set of inputs. The unit is adapted to the given mapping by determining its synaptic weights via a generalized Hebb rule; these weights are exactly the correlation coefficients just described. Learning is achieved in one shot, i.e., each instance of the mapping is presented to the unit exactly once. A proof is given that the higher-order unit retrieves the correct function without errors, provided all correlations are known.
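The following is a minimal sketch, not the paper's own code, of the scheme described in the abstract. It assumes the usual +/-1 encoding of Boolean values; the helper names (powerset, learn_weights, retrieve) and the parity example are illustrative choices, not taken from the paper. One correlation coefficient is computed per subset of inputs via a Hebb-like product rule, and retrieval is the sign of the sum of all subset terms.

```python
from itertools import chain, combinations, product


def powerset(indices):
    """All subsets of the input indices, including the empty set."""
    return chain.from_iterable(combinations(indices, r) for r in range(len(indices) + 1))


def learn_weights(truth_table):
    """One-shot Hebbian learning: one correlation coefficient per subset.

    truth_table maps input tuples over {-1, +1} to outputs in {-1, +1}.
    """
    n = len(next(iter(truth_table)))
    weights = {}
    for subset in powerset(range(n)):
        corr = 0.0
        for x, y in truth_table.items():
            prod = 1
            for i in subset:
                prod *= x[i]
            corr += prod * y  # Hebb-like product of the subset term and the output
        weights[subset] = corr / len(truth_table)
    return weights


def retrieve(weights, x):
    """Higher-order unit output: sign of the weighted sum over all subset products."""
    s = 0.0
    for subset, w in weights.items():
        prod = 1
        for i in subset:
            prod *= x[i]
        s += w * prod
    return 1 if s >= 0 else -1


if __name__ == "__main__":
    # Example (an assumption for illustration): 3-input parity, which a
    # first-order threshold unit cannot represent.
    table = {x: x[0] * x[1] * x[2] for x in product((-1, 1), repeat=3)}
    w = learn_weights(table)
    # With all correlations known, retrieval is error-free on every instance.
    assert all(retrieve(w, x) == y for x, y in table.items())
```

Under these assumptions the learned weights coincide with the Walsh-style expansion coefficients of the target function, which is why retrieval reproduces the mapping exactly when every correlation is available.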