Back-propagation is not Efficient
[1] Landsborough Thomson. Book Reviews. Nature, 1962.
[2] Michael R. Garey and David S. Johnson. Computers and Intractability: A Guide to the Theory of NP-Completeness. W. H. Freeman, 1979.

[3] David E. Rumelhart, Geoffrey E. Hinton, and Ronald J. Williams. Learning representations by back-propagating errors. Nature, 1986.

[4] Gerald Tesauro. Scaling Relationships in Back-Propagation Learning: Dependence on Training Set Size. Complex Systems, 1987.

[5] R. Fletcher. Practical Methods of Optimization. Wiley, 1988.

[6] Gerald Tesauro and Bob Janssens. Scaling Relationships in Back-Propagation Learning. Complex Systems, 1988.

[7] Eric B. Baum et al. A Polynomial Time Algorithm That Learns Two Hidden Unit Nets. Neural Computation, 1990.

[8] J. Stephen Judd. Neural Network Design and the Complexity of Learning. MIT Press, 1990.

[9] Avrim Blum and Ronald L. Rivest. Training a 3-node neural network is NP-complete. In Proceedings of COLT '88, 1988.

[10] Nicholas J. Redding et al. Constructive higher-order network that is polynomial time. Neural Networks, 1993.

[11] Herbert Wiklicky. The neural network loading problem is undecidable. 1994.

[12] Angus Macintyre and Eduardo D. Sontag. Finiteness results for sigmoidal “neural” networks. In Proceedings of STOC '93, 1993.

[13] Klaus-Uwe Höffgen. Computational Limitations on Training Sigmoid Neural Networks. Information Processing Letters, 1993.

[14] Jiří Šíma. Loading Deep Networks Is Hard. Neural Computation, 1994.

[15] Wolfgang Maass. Perspectives of Current Research about the Complexity of Learning on Neural Nets. 1994.

[16] Věra Kůrková and Paul C. Kainen. Functionally Equivalent Feedforward Neural Networks. Neural Computation, 1994.

[17] Bhaskar DasGupta, Hava T. Siegelmann, and Eduardo D. Sontag. On the complexity of training neural networks with continuous activation functions. IEEE Transactions on Neural Networks, 1995.