暂无分享,去创建一个
[1] D K Smith,et al. Numerical Optimization , 2001, J. Oper. Res. Soc..
[2] Sergey Ioffe,et al. Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift , 2015, ICML.
[3] Joshua B. Tenenbaum,et al. End-to-End Differentiable Physics for Learning and Control , 2018, NeurIPS.
[4] Yoshua Bengio,et al. Tackling Climate Change with Machine Learning , 2019, ACM Comput. Surv..
[5] Gokcen Kestor,et al. Smart-PGSim: Using Neural Network to Accelerate AC-OPF Power Grid Simulation , 2020, SC20: International Conference for High Performance Computing, Networking, Storage and Analysis.
[6] Slawomir Koziel,et al. Surrogate-Based Modeling and Optimization , 2013 .
[7] J. Zico Kolter,et al. What game are we playing? End-to-end learning in normal and extensive form games , 2018, IJCAI.
[8] Yoshua Bengio,et al. Machine Learning for Combinatorial Optimization: a Methodological Tour d'Horizon , 2018, Eur. J. Oper. Res..
[9] Natalia Gimelshein,et al. PyTorch: An Imperative Style, High-Performance Deep Learning Library , 2019, NeurIPS.
[10] Vladlen Koltun,et al. Deep Equilibrium Models , 2019, NeurIPS.
[11] Priya L. Donti,et al. SATNet: Bridging deep learning and logical reasoning using a differentiable satisfiability solver , 2019, ICML.
[12] Andreas Krause,et al. Differentiable Learning of Submodular Models , 2017, NIPS 2017.
[13] Max Welling,et al. Group Equivariant Convolutional Networks , 2016, ICML.
[14] R. Hartley,et al. Deep Declarative Networks: A New Hope , 2019, ArXiv.
[15] Stephen P. Boyd,et al. OSQP: an operator splitting solver for quadratic programs , 2017, 2018 UKACC 12th International Conference on Control (CONTROL).
[16] David Duvenaud,et al. Neural Ordinary Differential Equations , 2018, NeurIPS.
[17] Jason Yosinski,et al. Hamiltonian Neural Networks , 2019, NeurIPS.
[18] Daniel Cremers,et al. Homogeneous Linear Inequality Constraints for Neural Network Activations , 2019, 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW).
[19] Line Roald,et al. Learning for Constrained Optimization: Identifying Optimal Active Constraint Sets , 2018, INFORMS J. Comput..
[20] Amin Kargarian,et al. A Survey on Applications of Machine Learning for Optimal Power Flow , 2020, 2020 IEEE Texas Power and Energy Conference (TPEC).
[21] Nitish Srivastava,et al. Dropout: a simple way to prevent neural networks from overfitting , 2014, J. Mach. Learn. Res..
[22] WächterAndreas,et al. On the implementation of an interior-point filter line-search algorithm for large-scale nonlinear programming , 2006 .
[23] Michael I. Jordan,et al. First-order methods almost always avoid saddle points: The case of vanishing step-sizes , 2019, NeurIPS.
[24] Pierre Gentine,et al. Achieving Conservation of Energy in Neural Network Emulators for Climate Modeling , 2019, ArXiv.
[25] Kyri Baker,et al. Learning Warm-Start Points For Ac Optimal Power Flow , 2019, 2019 IEEE 29th International Workshop on Machine Learning for Signal Processing (MLSP).
[26] Kyri Baker,et al. Learning Optimal Solutions for Extremely Fast AC Optimal Power Flow , 2019, 2020 IEEE International Conference on Communications, Control, and Computing Technologies for Smart Grids (SmartGridComm).
[27] Stephen P. Boyd,et al. Differentiable Convex Optimization Layers , 2019, NeurIPS.
[28] Andreas Krause,et al. Differentiable Submodular Maximization , 2018, IJCAI.
[29] Stephen P. Boyd,et al. Solution refinement at regular points of conic problems , 2018, Computational Optimization and Applications.
[30] Milind Tambe,et al. Melding the Data-Decisions Pipeline: Decision-Focused Learning for Combinatorial Optimization , 2018, AAAI.
[31] J. Zico Kolter,et al. OptNet: Differentiable Optimization as a Layer in Neural Networks , 2017, ICML.
[32] Priya L. Donti,et al. Task-based End-to-end Model Learning in Stochastic Optimization , 2017, NIPS.