Notes on the Symmetries of 2-Layer ReLU-Networks
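The references collected below all concern the two standard parameter symmetries that leave a 2-layer ReLU network's function unchanged: permuting the hidden units, and positively rescaling a unit's incoming weights and bias while inversely rescaling its outgoing weights (using relu(c·z) = c·relu(z) for c > 0). A minimal NumPy sketch checking both invariances numerically; all shapes and variable names here are illustrative, not taken from the paper or the works cited:

import numpy as np

rng = np.random.default_rng(0)

# Random 2-layer ReLU network f(x) = W2 @ relu(W1 @ x + b1) + b2.
d_in, d_hid, d_out = 3, 5, 2
W1 = rng.normal(size=(d_hid, d_in))
b1 = rng.normal(size=d_hid)
W2 = rng.normal(size=(d_out, d_hid))
b2 = rng.normal(size=d_out)

def f(x, W1, b1, W2, b2):
    return W2 @ np.maximum(W1 @ x + b1, 0.0) + b2

x = rng.normal(size=d_in)
y = f(x, W1, b1, W2, b2)

# Symmetry 1: permute the hidden units (rows of W1 and b1,
# columns of W2); P.T @ P = I, so the function is unchanged.
P = np.eye(d_hid)[rng.permutation(d_hid)]
y_perm = f(x, P @ W1, P @ b1, W2 @ P.T, b2)

# Symmetry 2: positive per-unit rescaling, since relu(c*z) = c*relu(z)
# for c > 0: scale unit i's incoming weights and bias by c_i and its
# outgoing weights by 1/c_i.
c = rng.uniform(0.5, 2.0, size=d_hid)
y_scale = f(x, c[:, None] * W1, c * b1, W2 / c[None, :], b2)

print(np.allclose(y, y_perm), np.allclose(y, y_scale))  # True True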
[1] Laurent Dinh et al. Sharp Minima Can Generalize For Deep Nets. ICML, 2017.
[2] David Rolnick et al. Identifying Weights and Architectures of Unknown ReLU Networks. arXiv, 2019.
[3] Behnam Neyshabur et al. Path-SGD: Path-Normalized Optimization in Deep Neural Networks. NIPS, 2015.
[4] Mary Phuong and Christoph H. Lampert. Functional vs. parametric equivalence of ReLU networks. ICLR, 2020.
[5] Vera Kůrková and Paul C. Kainen. Functionally Equivalent Feedforward Neural Networks. Neural Computation, 1994.
[6] Francesca Albertini et al. Uniqueness of Weights for Neural Networks. 1993.
[7] Philipp Grohs et al. How degenerate is the parametrization of neural networks with the ReLU activation function? NeurIPS, 2019.
[8] An Mei Chen et al. On the Geometry of Feedforward Neural Network Error Surfaces. Neural Computation, 1993.
[9] Héctor J. Sussmann. Uniqueness of the weights for minimal feedforward nets with a given input-output map. Neural Networks, 1992.