Classification capabilities of architecture-specific recurrent networks
The classification capabilities of Elman and Jordan architecture-specific recurrent threshold networks are analyzed in terms of the number and possible types of cells the networks are capable of forming in the input and hidden activation spaces. For Elman networks the number of cells is always 2^h, there are no closed or imaginary cells, and such networks are therefore not capable of forming disconnected decision regions. For Jordan networks this holds only when the number of hidden units is less than or equal to the sum of the numbers of input and state units. We have interpreted the equations obtained, compared the results with feedforward threshold networks, and illustrated them with an example.
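The 2^h bound can be illustrated with a minimal sketch (not the paper's own code): each hidden threshold unit splits the combined input/context space with a hyperplane, so the vector of hidden outputs is one of at most 2^h binary patterns, each labeling one cell. The network below, with its copy-back context in the Elman style, is a hypothetical construction with random weights chosen only for illustration.

```python
import numpy as np

def step(z):
    # Hard threshold activation: 1 if z > 0, else 0.
    return (z > 0).astype(int)

def hidden_pattern(x, context, W, b):
    # Elman-style hidden layer: inputs are the external input x
    # concatenated with the previous hidden state (context units).
    z = W @ np.concatenate([x, context]) + b
    return tuple(step(z))

h, n_in = 3, 2                       # hidden units, input units (illustrative sizes)
rng = np.random.default_rng(0)
W = rng.normal(size=(h, n_in + h))   # random weights for illustration only
b = rng.normal(size=h)

# Sample the input/context space and collect the distinct hidden
# activation patterns; there can be at most 2**h of them, one per cell.
patterns = set()
for _ in range(20000):
    x = rng.uniform(-5, 5, size=n_in)
    c = rng.uniform(-5, 5, size=h)
    patterns.add(hidden_pattern(x, c, W, b))

print(len(patterns), "cells observed; bound is", 2 ** h)
```

Each distinct pattern corresponds to one intersection of half-spaces, i.e. one (convex, hence connected) cell, which is why disconnected decision regions cannot arise at this layer.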
[1] J. L. Elman, "Finding structure in time," Cognitive Science, 1990.
[2] M. I. Jordan, "Attractor dynamics and parallelism in a connectionist sequential machine," 1990.
[3] J. Ludik et al., "Training, dynamics, and complexity of architecture-specific recurrent neural networks," 1994.
[4] A. El-Jaroudi et al., "Classification capabilities of two-layer neural nets," International Conference on Acoustics, Speech, and Signal Processing, 1989.