On The Verification of Neural ODEs with Stochastic Guarantees

We show that Neural ODEs, an emerging class of time-continuous neural networks, can be verified by solving a set of global-optimization problems. For this purpose, we introduce Stochastic Lagrangian Reachability (SLR), an abstraction-based technique for constructing a tight Reachtube (an over-approximation of the set of reachable states over a given time-horizon), and provide stochastic guarantees in the form of confidence intervals for the Reachtube bounds. SLR inherently avoids the infamous wrapping effect (accumulation of over-approximation errors) by performing local optimization steps to expand safe regions instead of repeatedly forward-propagating them as is done by deterministic reachability methods. To enable fast local optimizations, we introduce a novel forward-mode adjoint sensitivity method to compute gradients without the need for backpropagation. Finally, we establish asymptotic and non-asymptotic convergence rates for SLR.
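To illustrate the idea behind forward-mode sensitivity (the gradients-without-backpropagation ingredient mentioned above), here is a minimal sketch for a scalar ODE. This is an illustrative toy, not the paper's algorithm: the state x and its sensitivity s = dx/dθ are integrated forward together, using the sensitivity ODE ds/dt = (∂f/∂x)·s + ∂f/∂θ, so no reverse pass over the trajectory is required. The dynamics f(x, θ) = -θx and the explicit Euler integrator are assumptions chosen for brevity.

```python
import math

# Toy dynamics dx/dt = f(x, theta) = -theta * x, with hand-coded partials.
def f(x, theta):
    return -theta * x

def df_dx(x, theta):
    return -theta

def df_dtheta(x, theta):
    return -x

def forward_sensitivity(x0, theta, t_end=1.0, n_steps=1000):
    """Integrate state and sensitivity s = dx/dtheta jointly (explicit Euler)."""
    dt = t_end / n_steps
    x, s = x0, 0.0  # sensitivity is zero at t = 0 since x0 is independent of theta
    for _ in range(n_steps):
        # Sensitivity ODE: ds/dt = (df/dx) * s + df/dtheta, evaluated at the old x.
        s = s + dt * (df_dx(x, theta) * s + df_dtheta(x, theta))
        x = x + dt * f(x, theta)
    return x, s

x_T, grad = forward_sensitivity(x0=1.0, theta=0.5)
# Analytic check: x(T) = exp(-theta*T) and dx/dtheta = -T*exp(-theta*T).
print(x_T, grad, -1.0 * math.exp(-0.5))
```

For a Neural ODE, f would be a neural network and s a matrix (one column per parameter), but the structure of the augmented system is the same: one forward integration yields both the trajectory and its parameter gradients.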
