Efficient forecasting of chaotic systems with block-diagonal and binary reservoir computing.
[1] C. Räth, et al. Breaking symmetries of the reservoir equations in echo state networks, 2020, Chaos.
[2] Joschka Herteux, et al. Reducing network size and improving prediction stability of reservoir computing, 2020, Chaos.
[3] D. Gauthier, et al. Forecasting Chaotic Systems with Very Low Connectivity Reservoir Computers, 2019, Chaos.
[4] Ribana Roscher, et al. Explainable Machine Learning for Scientific Insights and Discoveries, 2019, IEEE Access.
[5] Martin Gerlach, et al. Testing Statistical Laws in Complex Systems, 2019, Physical Review Letters.
[6] Christoph Räth, et al. Good and bad predictions: Assessing and improving the replication of chaotic attractors by means of reservoir computing, 2019, Chaos.
[7] Louis M. Pecora, et al. Network Structure Effects in Reservoir Computers, 2019, Chaos.
[8] Toshiyuki Yamane, et al. Recent Advances in Physical Reservoir Computing: A Review, 2018, Neural Networks.
[9] Yang Wang, et al. Manifold: A Model-Agnostic Framework for Interpretation and Diagnosis of Machine Learning Models, 2018, IEEE Transactions on Visualization and Computer Graphics.
[10] Edward Ott, et al. Attractor reconstruction by machine learning, 2018, Chaos.
[11] Aaron Clauset, et al. Scale-free networks are rare, 2018, Nature Communications.
[12] Jaideep Pathak, et al. Using machine learning to replicate chaotic attractors and calculate Lyapunov exponents from data, 2017, Chaos.
[13] Vincent Dumoulin, et al. Generative Adversarial Networks: An Overview, 2017, arXiv:1710.07035.
[14] Harald Haas, et al. Harnessing Nonlinearity: Predicting Chaotic Systems and Saving Energy in Wireless Communication, 2004, Science.
[15] Henry Markram, et al. Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations, 2002, Neural Computation.
[16] Albert-László Barabási, et al. Statistical mechanics of complex networks, 2001, arXiv.
[17] Duncan J. Watts, et al. Collective dynamics of ‘small-world’ networks, 1998, Nature.
[18] Sepp Hochreiter, et al. The Vanishing Gradient Problem During Learning Recurrent Neural Nets and Problem Solutions, 1998, Int. J. Uncertain. Fuzziness Knowl. Based Syst.
[19] M. Rosenstein, et al. A practical method for calculating largest Lyapunov exponents from small data sets, 1993.
[20] A. Wolf, et al. Determining Lyapunov exponents from a time series, 1985.
[21] P. Grassberger. Generalized dimensions of strange attractors, 1983.
[22] C. Paige. Bidiagonalization of Matrices and Solution of Linear Equations, 1974.
[23] A. E. Hoerl, et al. Ridge Regression: Applications to Nonorthogonal Problems, 1970.
[24] B. Mandelbrot. How Long Is the Coast of Britain? Statistical Self-Similarity and Fractional Dimension, 1967, Science.
[25] E. Lorenz. Deterministic nonperiodic flow, 1963.
[26] P. Erdős, et al. On the evolution of random graphs, 1960.
[27] Robert Shaw. Strange Attractors, Chaotic Behavior, and Information Flow, 1981.