Almost Surely Stable Deep Dynamics
R. Bhushan Gopaluni | Philip D. Loewen | Michael G. Forbes | Nathan P. Lawrence | Johan U. Backström
[1] Eldad Haber, et al. Reversible Architectures for Arbitrarily Deep Residual Neural Networks, 2017, AAAI.
[2] Brian D. O. Anderson, et al. Lyapunov Criterion for Stochastic Systems and Its Applications in Distributed Computation, 2019, IEEE Transactions on Automatic Control.
[3] Jimmy Ba, et al. Adam: A Method for Stochastic Optimization, 2014, ICLR.
[4] Lei Xu, et al. Input Convex Neural Networks: Supplementary Material, 2017.
[5] Sandra Hirche, et al. Equilibrium distributions and stability analysis of Gaussian Process State Space Models, 2016, IEEE 55th Conference on Decision and Control (CDC).
[6] J. Zico Kolter, et al. Learning Stable Deep Dynamics Models, 2020, NeurIPS.
[7] Moritz Hardt, et al. Stable Recurrent Models, 2018, ICLR.
[8] Frank Kozin, et al. A survey of stability of stochastic systems, 1969, Automatica.
[9] Sicun Gao, et al. Neural Lyapunov Control, 2020, NeurIPS.
[10] Harold J. Kushner, et al. A partial history of the early development of continuous-time nonlinear stochastic systems theory, 2014, Automatica.
[11] Ruggero Carli, et al. Lyapunov Theory for Discrete Time Systems, 2018, arXiv:1809.05289.
[12] Alex Graves, et al. Generating Sequences With Recurrent Neural Networks, 2013, arXiv.
[13] H. Kushner. On the Stability of Stochastic Dynamical Systems, 1965, Proceedings of the National Academy of Sciences of the United States of America.
[14] P. Olver. Nonlinear Systems, 2013.
[15] Yuanyuan Shi, et al. Optimal Control Via Neural Networks: A Convex Approach, 2018, ICLR.
[16] A. J. Roberts, et al. Modify the Improved Euler scheme to integrate stochastic differential equations, 2012, arXiv:1210.0933.
[17] W. Rudin. Principles of Mathematical Analysis, 1964.
[18] Laurent El Ghaoui, et al. Implicit Deep Learning, 2019, SIAM J. Math. Data Sci.
[19] Sandra Hirche, et al. Stability of Gaussian process state space models, 2016, European Control Conference (ECC).
[20] Anders P. Eriksson, et al. Implicitly Defined Layers in Neural Networks, 2020, arXiv.
[21] Vladlen Koltun, et al. Deep Equilibrium Models, 2019, NeurIPS.
[22] R. Kalman, et al. Control system analysis and design via the second method of Lyapunov: (I) continuous-time systems, (II) discrete-time systems, 1959.
[23] Aude Billard, et al. Learning Stable Nonlinear Dynamical Systems With Gaussian Mixture Models, 2011, IEEE Transactions on Robotics.
[24] Suiyang Khoo, et al. Some properties of finite-time stable stochastic nonlinear systems, 2015, Appl. Math. Comput.
[25] Eldad Haber, et al. Stable architectures for deep neural networks, 2017, arXiv.
[26] Ofir Nachum, et al. A Lyapunov-based Approach to Safe Reinforcement Learning, 2018, NeurIPS.
[27] Tengyu Ma, et al. Gradient Descent Learns Linear Dynamical Systems, 2016, J. Mach. Learn. Res.
[28] Ruslan Salakhutdinov, et al. Learning Stochastic Feedforward Neural Networks, 2013, NIPS.
[29] Andreas Krause, et al. The Lyapunov Neural Network: Adaptive Stability Certification for Safe Learning of Dynamical Systems, 2018, CoRL.
[30] Stephen P. Boyd, et al. Differentiable Convex Optimization Layers, 2019, NeurIPS.
[31] Sandra Hirche, et al. Learning Stable Stochastic Nonlinear Dynamical Systems, 2017, ICML.
[32] S. Srihari. Mixture Density Networks, 1994.
[33] David J. Fleet, et al. Gaussian Process Dynamical Models, 2005, NIPS.
[34] Julien Cornebise, et al. Weight Uncertainty in Neural Networks, 2015, arXiv.
[35] J. Zico Kolter, et al. OptNet: Differentiable Optimization as a Layer in Neural Networks, 2017, ICML.
[36] Marcello Farina, et al. LSTM Neural Networks: Input to State Stability and Probabilistic Safety Verification, 2019, L4DC.
[37] Heiga Zen, et al. Deep mixture density networks for acoustic modeling in statistical parametric speech synthesis, 2014, IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).
[38] Andreas Krause, et al. Safe Model-based Reinforcement Learning with Stability Guarantees, 2017, NIPS.
[39] David Duvenaud, et al. Neural Ordinary Differential Equations, 2018, NeurIPS.
[40] David Duvenaud, et al. Neural Networks with Cheap Differential Operators, 2019, NeurIPS.
[41] Samet Oymak, et al. Stochastic Gradient Descent Learns State Equations with Nonlinear Activations, 2018, COLT.
[42] Natalia Gimelshein, et al. PyTorch: An Imperative Style, High-Performance Deep Learning Library, 2019, NeurIPS.