Analysis of Recurrent Neural Network and Predictions
Jieun Park | Dokkyun Yi | Sangmin Ji
[1] Weijie Ren, et al. A Hybrid Model Based on a Two-Layer Decomposition Approach and an Optimized Neural Network for Chaotic Time Series Prediction, 2019, Symmetry.
[2] Yu Zhang, et al. EEG classification using sparse Bayesian extreme learning machine for brain–computer interface, 2018, Neural Computing and Applications.
[3] Jürgen Schmidhuber, et al. Long Short-Term Memory, 1997, Neural Computation.
[4] Jürgen Schmidhuber, et al. 2005 Special Issue, 2005.
[5] Jürgen Schmidhuber, et al. A local learning algorithm for dynamic feedforward and recurrent networks, 1990, Forschungsberichte, TU Munich.
[6] Jürgen Schmidhuber, et al. A Fixed Size Storage O(n3) Time Complexity Learning Algorithm for Fully Recurrent Continually Running Networks, 1992, Neural Computation.
[7] Jürgen Schmidhuber, et al. Framewise phoneme classification with bidirectional LSTM and other neural network architectures, 2005, Neural Networks.
[8] Prudence W. H. Wong, et al. Hierarchical Meta-Learning in Time Series Forecasting for Improved Interference-Less Machine Learning, 2017, Symmetry.
[9] Paul J. Werbos, et al. Generalization of backpropagation with application to a recurrent gas market model, 1988, Neural Networks.
[10] Abdullah Al Khaled, et al. Fuzzy adaptive imperialist competitive algorithm for global optimization, 2014, Neural Computing and Applications.
[11] Jeffrey L. Elman, et al. Finding Structure in Time, 1990, Cogn. Sci..
[12] Geoffrey E. Hinton, et al. Learning representations by back-propagating errors, 1986, Nature.
[13] Yoshua Bengio, et al. Learning long-term dependencies with gradient descent is difficult, 1994, IEEE Trans. Neural Networks.
[14] Jürgen Schmidhuber, et al. Learning Precise Timing with LSTM Recurrent Networks, 2003, J. Mach. Learn. Res..
[15] Yu Zhang, et al. Multi-kernel extreme learning machine for EEG classification in brain-computer interfaces, 2018, Expert Syst. Appl..