Estimating Lipschitz constants of monotone deep equilibrium models

Several methods have been proposed in recent years to bound the Lipschitz constants of deep networks; such bounds can be used to certify robustness, derive generalization bounds, and characterize the smoothness of decision boundaries. However, existing bounds become substantially weaker as network depth increases, making it unclear how to apply them to recently proposed architectures such as the deep equilibrium (DEQ) model, which can be viewed as an infinitely deep network. In this paper, we show that monotone DEQs, a recently proposed subclass of DEQs, have Lipschitz constants that can be bounded by a simple function of the network's strong monotonicity parameter. We derive simple yet tight bounds on both the input-output mapping and the weight-output mapping defined by these networks, and demonstrate that they are small relative to those of comparable standard DNNs. We show that these bounds can be used to design monotone DEQ models, even with multi-scale convolutional structure, that retain explicit constraints on their Lipschitz constants. We also show how these bounds can be used to develop PAC-Bayes generalization bounds that do not depend on network depth, avoiding the exponential depth dependence of comparable DNN bounds.
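The abstract leaves the bound implicit; the central object is an equilibrium layer z* = sigma(W z* + U x + b) whose weight satisfies I - W >= m*I for a strong monotonicity parameter m > 0, and the natural input-output bound consistent with this setup is ||U||_2 / m. The sketch below illustrates that estimate under these assumptions; the monDEQ-style parameterization W = (1 - m)I - A^T A + B - B^T, the forward-backward fixed-point solver, and all dimensions are illustrative choices, not details taken from the paper.

```python
# Minimal sketch (illustrative, not the authors' code): empirically check the
# input-output Lipschitz bound ||U||_2 / m for a monotone DEQ layer
#     z* = relu(W z* + U x + b),   with I - W >= m I.
import numpy as np

rng = np.random.default_rng(0)
d_in, d_hid, m = 10, 50, 0.5        # input dim, hidden dim, monotonicity parameter

# monDEQ-style parameterization: guarantees I - W >= m I for any A, B,
# since the symmetric part of I - W is m I + A^T A.
A = rng.standard_normal((d_hid, d_hid)) / np.sqrt(d_hid)
B = rng.standard_normal((d_hid, d_hid)) / np.sqrt(d_hid)
W = (1 - m) * np.eye(d_hid) - A.T @ A + (B - B.T)
U = rng.standard_normal((d_hid, d_in)) / np.sqrt(d_in)
b = rng.standard_normal(d_hid)

def equilibrium(x, tol=1e-9, max_iter=100_000):
    """Forward-backward splitting for the fixed point z = relu(W z + U x + b)."""
    L = np.linalg.norm(np.eye(d_hid) - W, 2)  # Lipschitz const. of F(z) = (I - W)z - (Ux + b)
    alpha = m / L**2                          # safe step size (below 2m / L^2)
    z = np.zeros(d_hid)
    for _ in range(max_iter):
        # relu = projection onto the nonnegative orthant, i.e. the prox step
        z_next = np.maximum(z - alpha * ((z - W @ z) - (U @ x + b)), 0.0)
        if np.linalg.norm(z_next - z) <= tol:
            break
        z = z_next
    return z

lip_bound = np.linalg.norm(U, 2) / m          # the claimed input-output bound

# Empirical check: the observed ratio should never exceed the bound.
x1 = rng.standard_normal(d_in)
x2 = x1 + 1e-3 * rng.standard_normal(d_in)
observed = np.linalg.norm(equilibrium(x1) - equilibrium(x2)) / np.linalg.norm(x1 - x2)
print(f"bound ||U||_2/m = {lip_bound:.3f}, observed ratio = {observed:.3f}")
```

Note that the bound depends only on ||U||_2 and m, not on any notion of depth, so it is unchanged no matter how many solver iterations the equilibrium computation effectively unrolls; this is exactly the property that lets the paper's PAC-Bayes bounds avoid depth dependence.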
