A Simple Yet Efficient Evolution Strategy for Large-Scale Black-Box Optimization
[1] Anne Auger et al. Comparison-based natural gradient optimization in high dimension, 2014, GECCO.
[2] Ilya Loshchilov et al. A computationally efficient limited memory CMA-ES for large scale optimization, 2014, GECCO.
[3] Petros Koumoutsakos et al. Reducing the Time Complexity of the Derandomized Evolution Strategy with Covariance Matrix Adaptation (CMA-ES), 2003, Evolutionary Computation.
[4] Nikolaus Hansen et al. Completely Derandomized Self-Adaptation in Evolution Strategies, 2001, Evolutionary Computation.
[5] Xiaodong Li et al. A Comparative Study of CMA-ES on Large Scale Global Optimisation, 2010, Australasian Conference on Artificial Intelligence.
[6] Shun-ichi Amari et al. Natural Gradient Works Efficiently in Learning, 1998, Neural Computation.
[7] Bernhard Sendhoff et al. Simplify Your Covariance Matrix Adaptation Evolution Strategy, 2017, IEEE Transactions on Evolutionary Computation.
[8] Hans-Georg Beyer et al. Performance analysis of evolutionary optimization with cumulative step length adaptation, 2004, IEEE Transactions on Automatic Control.
[9] Antonio LaTorre et al. Multiple Offspring Sampling in Large Scale Global Optimization, 2012, IEEE Congress on Evolutionary Computation.
[10] Anne Auger et al. Impacts of invariance in search: When CMA-ES and PSO face ill-conditioned and non-separable problems, 2011, Applied Soft Computing.
[11] Xin Yao et al. Large scale evolutionary optimization using cooperative coevolution, 2008, Information Sciences.
[12] Nikolaus Hansen et al. Evaluating the CMA Evolution Strategy on Multimodal Test Functions, 2004, PPSN.
[13] Tom Schaul et al. A linear time natural evolution strategy for non-separable functions, 2011, GECCO.
[14] Hans-Paul Schwefel et al. Evolution and Optimum Seeking: The Sixth Generation, 1993.
[15] M. Brand et al. Fast low-rank modifications of the thin singular value decomposition, 2006.
[16] Peter Tiño et al. Scaling Up Estimation of Distribution Algorithms for Continuous Optimization, 2011, IEEE Transactions on Evolutionary Computation.
[17] Dirk V. Arnold et al. Weighted multirecombination evolution strategies, 2006, Theoretical Computer Science.
[18] Kenneth A. De Jong et al. A Cooperative Coevolutionary Approach to Function Optimization, 1994, PPSN.
[19] Ata Kabán et al. Toward Large-Scale Continuous EDA: A Random Matrix Theory Perspective, 2013, Evolutionary Computation.
[20] Nikolaus Hansen et al. A restart CMA evolution strategy with increasing population size, 2005, IEEE Congress on Evolutionary Computation.
[21] Anne Auger et al. A median success rule for non-elitist evolution strategies: study of feasibility, 2013, GECCO.
[22] Qingfu Zhang et al. An efficient rank-1 update for Cholesky CMA-ES using auxiliary evolution path, 2017, IEEE Congress on Evolutionary Computation.
[23] Nikolaus Hansen et al. The CMA Evolution Strategy: A Comparing Review, 2006, Towards a New Evolutionary Computation.
[24] Geoffrey E. Hinton et al. Learning representations by back-propagating errors, 1986, Nature.
[25] Shahryar Rahnamayan et al. Metaheuristics in large-scale global continues optimization: A survey, 2015, Information Sciences.
[26] Eugenius Kaszkurewicz et al. Steepest descent with momentum for quadratic functions is a version of the conjugate gradient method, 2004, Neural Networks.
[27] Tom Schaul et al. High dimensions and heavy tails for natural evolution strategies, 2011, GECCO.
[28] Francisco Herrera et al. MA-SW-Chains: Memetic algorithm based on local search chains for large scale continuous global optimization, 2010, IEEE Congress on Evolutionary Computation.
[29] Christian Igel et al. A computational efficient covariance matrix update and a (1+1)-CMA for evolution strategies, 2006, GECCO.
[30] Christian Igel et al. Efficient covariance matrix update for variable metric evolution strategies, 2009, Machine Learning.
[31] Xiaodong Li et al. Benchmark Functions for the CEC'2010 Special Session and Competition on Large-Scale Global Optimization, 2009.
[32] Ilya Loshchilov et al. LM-CMA: An Alternative to L-BFGS for Large-Scale Black Box Optimization, 2015, Evolutionary Computation.
[33] Oswin Krause et al. A More Efficient Rank-one Covariance Matrix Update for Evolution Strategies, 2015, FOGA.
[34] Andreas Zell et al. Main vector adaptation: a CMA variant with linear time and space complexity, 2001.
[35] Yun Shang et al. A Note on the Extended Rosenbrock Function, 2006.
[36] Tom Schaul et al. Natural Evolution Strategies, 2008, IEEE Congress on Evolutionary Computation (IEEE World Congress on Computational Intelligence).
[37] Raymond Ros et al. A Simple Modification in CMA-ES Achieving Linear Time and Space Complexity, 2008, PPSN.
[38] Xiaodong Li et al. A Competitive Divide-and-Conquer Algorithm for Unconstrained Large-Scale Black-Box Optimization, 2016, ACM Transactions on Mathematical Software.
[39] Hans-Georg Beyer et al. Convergence Analysis of Evolutionary Algorithms That Are Based on the Paradigm of Information Geometry, 2014, Evolutionary Computation.
[40] Anne Auger et al. Information-Geometric Optimization Algorithms: A Unifying Picture via Invariance Principles, 2011, Journal of Machine Learning Research.
[41] Bernhard Sendhoff et al. Covariance Matrix Adaptation Revisited - The CMSA Evolution Strategy, 2008, PPSN.
[42] Tom Schaul et al. Exponential natural evolution strategies, 2010, GECCO.
[43] James N. Knight et al. Reducing the space-time complexity of the CMA-ES, 2007, GECCO.
[44] Youhei Akimoto et al. Projection-Based Restricted Covariance Matrix Adaptation for High Dimension, 2016, GECCO.