Many similarities exist between the class of nonlinear filters called stack filters and neural networks. In this chapter, we describe the relationships between these areas and show how the theory of stack filters may be applied to the study of neural nets. The concept of a neural network represents one of the most original ideas ever to appear in the field of computing and artificial intelligence. However, many of the tools used to study these architectures are not innovative; they are the same tools used to analyze more conventional systems: the mean squared error measure, orthogonality of functions, optimization by minimizing energy, and correlation techniques. These are the same tools that have largely failed modern AI as well as the study of nonlinear systems. We believe these classical tools of linear theory will be of limited use in the study of neural systems.
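To make the notion of a stack filter concrete, consider its best-known member, the median filter. The sketch below is illustrative only and is not taken from the chapter; the window width and edge-replication padding are our own assumptions. It shows the characteristic nonlinear behavior that motivates moving beyond linear tools: an isolated impulse is removed while a step edge passes through unchanged.

```python
# Illustrative sketch (not the chapter's own code): the median filter is
# the canonical stack filter.  A stack filter applies a positive Boolean
# function to each thresholded (binary) slice of the signal; by the
# stacking property this is equivalent to a rank-order operation on the
# original samples, which for the self-dual case is the median.
def median_filter(signal, width=3):
    """Sliding-window median with edge replication (width assumed odd)."""
    half = width // 2
    # Replicate the end samples so the output has the same length.
    padded = [signal[0]] * half + list(signal) + [signal[-1]] * half
    return [sorted(padded[i:i + width])[half] for i in range(len(signal))]

# The impulse (9) is removed while the step edge 0 -> 5 is preserved
# exactly -- behavior a linear moving average cannot reproduce.
print(median_filter([0, 0, 9, 0, 0, 5, 5, 5]))
# → [0, 0, 0, 0, 0, 5, 5, 5]
```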