A neural network that embeds its own meta-levels
[1] Jürgen Schmidhuber, et al. A Fixed Size Storage O(n³) Time Complexity Learning Algorithm for Fully Recurrent Continually Running Networks, 1992, Neural Computation.
[2] Ronald J. Williams, et al. Gradient-based learning algorithms for recurrent networks and their computational complexity, 1995.
[3] Jürgen Schmidhuber. Steps Towards 'Self-Referential' Neural Learning: A Thought Experiment, CU-CS-627-92, 1992.
[4] Fernando J. Pineda, et al. Time Dependent Adaptive Neural Networks, 1989, NIPS.
[5] Barak A. Pearlmutter. Learning State Space Trajectories in Recurrent Neural Networks, 1989, Neural Computation.
[6] Paul J. Werbos, et al. Generalization of backpropagation with application to a recurrent gas market model, 1988, Neural Networks.
[7] Jürgen Schmidhuber. Reducing the Ratio Between Learning Complexity and Number of Time Varying Variables in Fully Recurrent Nets, 1993.
[8] Jürgen Schmidhuber, et al. Learning to Control Fast-Weight Memories: An Alternative to Dynamic Recurrent Networks, 1992, Neural Computation.
[9] Ronald J. Williams, et al. Gradient-Based Learning Algorithms for Recurrent Networks, 1989.
[10] Ronald J. Williams, et al. A Learning Algorithm for Continually Running Fully Recurrent Neural Networks, 1989, Neural Computation.