On the Differentiability of the Solution to Convex Optimization Problems
In this paper, we provide conditions under which one can take derivatives of the solution to a convex optimization problem with respect to the problem data. These conditions are (roughly) that Slater's condition holds, that the functions involved are twice differentiable, and that a certain Jacobian matrix is non-singular. The derivation applies the implicit function theorem to the KKT system, which is necessary and sufficient for optimality under these assumptions.
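The idea can be illustrated on an equality-constrained quadratic program, where the KKT system is linear and the implicit function theorem reduces to a linear solve. The sketch below (a hypothetical minimal example, not the paper's code) computes the Jacobian of the solution x* of min ½xᵀQx + pᵀx subject to Ax = b with respect to p, and checks it against finite differences. Differentiating the KKT equations Qx + p + Aᵀν = 0, Ax = b with respect to p gives K · d[x; ν]/dp = [−I; 0], where K is the KKT matrix; its non-singularity is the Jacobian condition mentioned above.

```python
import numpy as np

np.random.seed(0)
n, m = 4, 2

# Problem data: Q positive definite, A full row rank (so K is non-singular).
M = np.random.randn(n, n)
Q = M @ M.T + n * np.eye(n)
A = np.random.randn(m, n)
b = np.random.randn(m)
p = np.random.randn(n)

# KKT matrix for the equality-constrained QP.
K = np.block([[Q, A.T], [A, np.zeros((m, m))]])

def solve_qp(p):
    """Solve min 0.5 x'Qx + p'x s.t. Ax = b via its KKT system."""
    z = np.linalg.solve(K, np.concatenate([-p, b]))
    return z[:n]  # primal solution x*

# Implicit function theorem: K * d[x; nu]/dp = [-I; 0],
# so dx*/dp is the top n-by-n block of the solve below.
rhs = np.vstack([-np.eye(n), np.zeros((m, n))])
dx_dp = np.linalg.solve(K, rhs)[:n]

# Finite-difference check of the analytic Jacobian.
eps = 1e-6
x0 = solve_qp(p)
fd = np.zeros((n, n))
for j in range(n):
    e = np.zeros(n)
    e[j] = eps
    fd[:, j] = (solve_qp(p + e) - x0) / eps

print(np.max(np.abs(dx_dp - fd)))  # small: analytic and numeric Jacobians agree
```

For general convex problems the KKT system is nonlinear, but the same pattern applies: linearize the KKT conditions at the solution and solve the resulting linear system for the desired sensitivities.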