A syntactic neural network is equivalent to a parser for a certain type of grammar; in this case, a strictly hierarchical context-free grammar. This equivalence gives an efficient method for pattern description and has the added advantage of being a generative model. The authors show how the network itself can infer the grammar. Syntactic neural nets can model stochastic or nonstochastic grammars. The stochastic nets are properly probabilistic and are powerful discriminators; the nonstochastic nets are less powerful, but have straightforward silicon implementations with existing technology. Learning in syntactic nets may be supervised or unsupervised. In each case the algorithm is the same; the difference lies in the data presented to the net. In prior publications, the authors applied syntactic neural nets to character recognition and cursive script recognition. Here they show that nonstochastic nets can perform signature verification with high reliability. This raises the possibility of signature verification on a robust smart card.
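The abstract gives no implementation detail, but the core idea that a stochastic syntactic net assigns a probability to an input string, in the same way a stochastic context-free grammar does, can be illustrated with a minimal sketch. The code below is an assumption-laden toy, not the authors' architecture: it uses a hypothetical grammar in Chomsky normal form and the standard inside (CYK-style) algorithm to compute the probability that the grammar generates a token sequence, which is the quantity a probabilistic parser-like net would use for discrimination.

```python
from collections import defaultdict

# Hypothetical toy stochastic CFG in Chomsky normal form (illustration only,
# not the grammar or network from the paper).
BINARY_RULES = {            # lhs -> (left, right) with probability p
    "S": [(("A", "B"), 1.0)],
    "A": [(("A", "A"), 0.3)],
    "B": [(("B", "B"), 0.4)],
}
UNARY_RULES = {             # lhs -> terminal with probability p
    "A": [("a", 0.7)],
    "B": [("b", 0.6)],
}

def inside_probability(tokens, start="S"):
    """Inside algorithm: probability that `start` derives the token sequence."""
    n = len(tokens)
    chart = defaultdict(dict)   # (i, j) -> {nonterminal: inside probability of tokens[i:j]}

    # Base case: unary (lexical) rules cover single tokens.
    for i, tok in enumerate(tokens):
        for lhs, productions in UNARY_RULES.items():
            for terminal, p in productions:
                if terminal == tok:
                    chart[(i, i + 1)][lhs] = chart[(i, i + 1)].get(lhs, 0.0) + p

    # Recursive case: combine adjacent spans with binary rules.
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):                       # split point
                for lhs, productions in BINARY_RULES.items():
                    for (left, right), p in productions:
                        p_left = chart[(i, k)].get(left, 0.0)
                        p_right = chart[(k, j)].get(right, 0.0)
                        if p_left and p_right:
                            chart[(i, j)][lhs] = chart[(i, j)].get(lhs, 0.0) + p * p_left * p_right

    return chart[(0, n)].get(start, 0.0)

# Example: P(S => "ab") = 1.0 * 0.7 * 0.6 = 0.42 under the toy grammar.
print(inside_probability(list("ab")))
```

A nonstochastic variant of the same computation would replace the probabilities with boolean accept/reject values, which is what makes such nets simpler to realize directly in hardware.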