[1] Phan-Minh Nguyen, et al. A Note on the Global Convergence of Multilayer Neural Networks in the Mean Field Regime, 2020, ArXiv.
[2] Phan-Minh Nguyen, et al. Analysis of feature learning in weight-tied autoencoders via the mean field lens, 2021, ArXiv.
[3] Justin A. Sirignano, et al. Mean field analysis of neural networks: A central limit theorem, 2018, Stochastic Processes and their Applications.
[4] Jianfeng Lu, et al. A Mean-field Analysis of Deep ResNet and Beyond: Towards Provable Optimization Via Overparameterization From Depth, 2020, ICML.
[5] Francis Bach, et al. On the Global Convergence of Gradient Descent for Over-parameterized Models using Optimal Transport, 2018, NeurIPS.
[6] Weinan E, et al. Stochastic Modified Equations and Dynamics of Stochastic Gradient Algorithms I: Mathematical Foundations, 2018, J. Mach. Learn. Res.
[7] Shun-ichi Amari. Understand it in 5 minutes!? Skimming famous papers: Jacot, Arthur, Gabriel, Franck and Hongler, Clément: Neural Tangent Kernel: Convergence and Generalization in Neural Networks, 2020.
[8] Phan-Minh Nguyen, et al. Global Convergence of Three-layer Neural Networks in the Mean Field Regime, 2021, ICLR.
[9] École d'été de probabilités de Saint-Flour XIX — 1989, 1991, Lecture Notes in Mathematics, Springer.
[10] Grant M. Rotskoff, et al. Neural Networks as Interacting Particle Systems: Asymptotic Convexity of the Loss Landscape and Universal Scaling of the Approximation Error, 2018, ArXiv.
[11] Jianfeng Lu, et al. Global optimality of softmax policy gradient with single hidden layer neural networks in the mean-field regime, 2020, ICLR.
[12] Adel Javanmard, et al. Analysis of a Two-Layer Neural Network via Displacement Convexity, 2019, The Annals of Statistics.
[13] Taiji Suzuki, et al. Stochastic Particle Gradient Descent for Infinite Ensembles, 2017, ArXiv.
[14] Phan-Minh Nguyen, et al. A Rigorous Framework for the Mean Field Limit of Multilayer Neural Networks, 2020, ArXiv.
[15] Andrea Montanari, et al. A mean field view of the landscape of two-layer neural networks, 2018, Proceedings of the National Academy of Sciences.
[16] Cong Fang, et al. Modeling from Features: a Mean-field Framework for Over-parameterized Deep Neural Networks, 2020, COLT.
[17] Taiji Suzuki, et al. Particle dual averaging: optimization of mean field neural network with global convergence rate analysis, 2020, NeurIPS.
[18] Marco Mondelli, et al. Landscape Connectivity and Dropout Stability of SGD Solutions for Over-parameterized Neural Networks, 2020, ICML.
[19] Phan-Minh Nguyen, et al. Mean Field Limit of the Learning Dynamics of Multilayer Neural Networks, 2019, ArXiv.
[20] A. Sznitman. Topics in propagation of chaos, 1991.
[21] Colin Wei, et al. Regularization Matters: Generalization and Optimization of Neural Nets v.s. their Induced Kernel, 2018, NeurIPS.
[22] Eugene A. Golikov, et al. Dynamically Stable Infinite-Width Limits of Neural Classifiers, 2020, ArXiv.