An MDL-principled evolutionary mechanism to automatic architecturing of pattern recognition neural network
A minimum-description-length (MDL) principled evolutionary mechanism for automatically architecting multilayer feedforward (MLFF) neural networks is proposed. This type of neural network is considered a generic system implementing a generic model. The final network resulting from the architecting and training process is seen as an instance of this generic system, and thus as an implemented instance of the generic model. Disregarding hardware fault tolerance, the description length of the network plus that of its performance deviation from the given training samples must be smaller than the description length of the original samples, and this total description length must be the minimum among all possible states. Constrained by the MDL principle, an MLFF neural network can be automatically architected and trained through an evolutionary mechanism in which the network is allowed both to expand and to reduce its architectural complexity. The resulting network has a partially connected MLFF architecture.
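The two-part MDL criterion described above can be sketched as follows. This is a minimal illustration, not the paper's actual method: the fixed-precision weight code, the idealized Gaussian residual code, and the `evolve_step` accept/reject rule are all assumptions introduced here to show how a total description length can arbitrate between expanding and pruning an architecture.

```python
import math

def model_bits(n_weights, bits_per_weight=16):
    # Description length of the network itself: each weight coded
    # at a fixed precision (an assumed, simplified model code).
    return n_weights * bits_per_weight

def residual_bits(errors):
    # Code length of the deviation from the training targets, using an
    # idealized Gaussian two-part code: (n/2) * log2(SSE/n), with
    # constant terms dropped. Can be negative for very small residuals.
    n = len(errors)
    sse = sum(e * e for e in errors)
    return 0.5 * n * math.log2(sse / n + 1e-12)

def mdl_cost(n_weights, errors):
    # Total description length: network + residuals.
    return model_bits(n_weights) + residual_bits(errors)

def evolve_step(current, candidate):
    """One evolutionary move: accept the candidate architecture
    (given as (n_weights, training_errors)) only if it shortens the
    total description length; otherwise keep the current network."""
    return candidate if mdl_cost(*candidate) < mdl_cost(*current) else current
```

Under this cost, a proposed expansion is accepted only when the extra bits spent describing the larger network are repaid by a larger saving in the residual code, and a pruning move is accepted in the symmetric case; for example, doubling the weight count is kept when it drives the errors down sharply, while a tenfold expansion that only modestly reduces error is rejected.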