How Neural Nets Work
There is presently great interest in the ability of neural networks to mimic "qualitative reasoning" by manipulating neural encodings of symbols. Less work has been done on using neural networks to process floating point numbers, and it is sometimes stated that neural networks are somehow inherently inaccurate and therefore best suited for "fuzzy" qualitative reasoning. Nevertheless, the potential speed of massively parallel operations makes neural net "number crunching" an interesting topic to explore. In this paper we discuss some of our work in which we demonstrate that, for certain applications, neural networks can achieve significantly higher numerical accuracy than more conventional techniques. In particular, prediction of future values of a chaotic time series can be performed with exceptionally high accuracy. We analyze how a neural net is able to do this, and in the process show that a large class of functions from R^n → R^m may be accurately approximated by a backpropagation neural net with just two "hidden" layers. The network uses this functional approximation to perform either interpolation (signal processing applications) or extrapolation (symbol processing applications). Neural nets therefore use quite familiar methods to perform their tasks. The geometrical viewpoint advocated here seems to be a useful approach to analyzing neural network operation, and relates neural networks to well-studied topics in functional approximation.
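As a minimal illustration of the setup the abstract describes, the sketch below trains a small two-hidden-layer backpropagation network to predict one step ahead on a chaotic time series. The specific choices here are assumptions, not the authors' configuration: the logistic map x_{t+1} = 4x_t(1 − x_t) as the chaotic series, tanh hidden units, 16 units per hidden layer, and plain full-batch gradient descent.

```python
import numpy as np

rng = np.random.default_rng(0)

# Generate a chaotic time series from the logistic map (an illustrative
# choice; the paper's own experiments may use a different system).
x = np.empty(201)
x[0] = 0.3
for t in range(200):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])
X = x[:-1].reshape(-1, 1)   # input: current value x_t
Y = x[1:].reshape(-1, 1)    # target: next value x_{t+1}

# Two hidden layers of tanh units, linear output -- the architecture
# class the abstract says suffices to approximate R^n -> R^m maps.
W1 = rng.normal(0.0, 0.5, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, (16, 16)); b2 = np.zeros(16)
W3 = rng.normal(0.0, 0.5, (16, 1)); b3 = np.zeros(1)

def forward(X):
    h1 = np.tanh(X @ W1 + b1)
    h2 = np.tanh(h1 @ W2 + b2)
    return h1, h2, h2 @ W3 + b3

# Full-batch gradient descent on mean-squared error (backpropagation).
lr, N = 0.2, len(X)
for _ in range(5000):
    h1, h2, out = forward(X)
    g_out = (out - Y) / N                     # dL/d(out), L = 0.5*MSE
    gW3 = h2.T @ g_out; gb3 = g_out.sum(0)
    g_h2 = (g_out @ W3.T) * (1.0 - h2 ** 2)   # backprop through tanh
    gW2 = h1.T @ g_h2; gb2 = g_h2.sum(0)
    g_h1 = (g_h2 @ W2.T) * (1.0 - h1 ** 2)
    gW1 = X.T @ g_h1; gb1 = g_h1.sum(0)
    W3 -= lr * gW3; b3 -= lr * gb3
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((forward(X)[2] - Y) ** 2))
print(f"training MSE: {mse:.6f}")
```

Because the target map here is a smooth quadratic, the network is effectively interpolating the function x ↦ 4x(1 − x) from sampled orbit points, which is the "functional approximation" reading of prediction that the abstract advocates.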