Generalized jump functions
[1] Carsten Witt et al. Self-Adjusting Evolutionary Algorithms for Multimodal Optimization, 2020, Algorithmica.
[2] Benjamin Doerr et al. A rigorous runtime analysis of the 2-MMASib on jump functions: ant colony optimizers can cope well with local optima, 2021, GECCO.
[3] Lazy parameter tuning and control: choosing all parameters randomly from a power-law distribution, 2021, GECCO.
[4] C. Witt et al. Stagnation detection in highly multimodal fitness landscapes, 2021, GECCO.
[5] Benjamin Doerr et al. Lower bounds from fitness levels made easy, 2021, GECCO.
[6] C. Witt et al. Stagnation Detection with Randomized Local Search, 2021, Evolutionary Computation.
[7] Benjamin Doerr et al. Theoretical Analyses of Multiobjective Evolutionary Algorithms on Multimodal Objectives, 2020, Evolutionary Computation.
[8] Benjamin Doerr et al. The Univariate Marginal Distribution Algorithm Copes Well with Deception and Epistasis, 2020, Evolutionary Computation.
[9] Markus Wagner et al. Evolutionary algorithms and submodular functions: benefits of heavy-tailed mutations, 2018, Natural Computing.
[10] Benjamin Doerr et al. First Steps Towards a Runtime Analysis When Starting With a Good Solution, 2020, PPSN.
[11] Benjamin Doerr et al. Runtime Analysis of a Heavy-Tailed (1+(λ, λ)) Genetic Algorithm on Jump Functions, 2020, PPSN.
[12] Benjamin Doerr et al. The (1 + (λ,λ)) GA is even faster on multimodal problems, 2020, GECCO.
[13] Benjamin Doerr et al. Fixed-Target Runtime Analysis, 2020, Algorithmica.
[14] Benjamin Doerr. Does Comma Selection Help to Cope with Local Optima?, 2020, GECCO.
[15] Thomas Bäck et al. Theory of Evolutionary Computation: Recent Developments in Discrete Optimization, 2020, Theory of Evolutionary Computation.
[16] Aishwaryaprajna et al. The benefits and limitations of voting mechanisms in evolutionary optimisation, 2019, FOGA '19.
[17] Per Kristian Lehre et al. On the limitations of the univariate marginal distribution algorithm to deception and where bivariate EDAs might help, 2019, FOGA '19.
[18] Andrei Lissovoi et al. On the Time Complexity of Algorithm Selection Hyper-Heuristics for Multimodal Optimisation, 2019, AAAI.
[19] Benjamin Doerr et al. Analyzing randomized search heuristics via stochastic domination, 2019, Theor. Comput. Sci..
[20] Benjamin Doerr et al. A tight runtime analysis for the cGA on jump functions: EDAs can cross fitness valleys at no extra cost, 2019, GECCO.
[21] Markus Wagner et al. Heavy-Tailed Mutation Operators in Single-Objective Combinatorial Optimization, 2018, PPSN.
[22] Chao Qian et al. Dynamic Mutation Based Pareto Optimization for Subset Selection, 2018, ICIC.
[23] Markus Wagner et al. Escaping large deceptive basins of attraction with heavy-tailed mutation operators, 2018, GECCO.
[24] Andrew M. Sutton et al. On the runtime dynamics of the compact genetic algorithm on jump functions, 2018, GECCO.
[25] Pietro Simone Oliveto et al. Fast Artificial Immune Systems, 2018, PPSN.
[26] Per Kristian Lehre et al. Escaping Local Optima Using Crossover With Emergent Diversity, 2018, IEEE Transactions on Evolutionary Computation.
[27] Dogan Corus et al. Standard Steady State Genetic Algorithms Can Hillclimb Faster Than Mutation-Only Evolutionary Algorithms, 2017, IEEE Transactions on Evolutionary Computation.
[28] Benjamin Doerr et al. Static and Self-Adjusting Mutation Strengths for Multi-valued Decision Variables, 2018, Algorithmica.
[29] Pietro Simone Oliveto et al. How to Escape Local Optima in Black Box Optimisation: When Non-elitism Outperforms Elitism, 2017, Algorithmica.
[30] Pietro Simone Oliveto et al. On the runtime analysis of the opt-IA artificial immune system, 2017, GECCO.
[31] Benjamin Doerr et al. Fast genetic algorithms, 2017, GECCO.
[32] Dirk Sudholt et al. How Crossover Speeds up Building Block Assembly in Genetic Algorithms, 2014, Evolutionary Computation.
[33] Frank Neumann et al. Fast Building Block Assembly by Majority Vote Crossover, 2016, GECCO.
[34] Duc-Cuong Dang et al. Escaping Local Optima with Diversity Mechanisms and Crossover, 2016, GECCO.
[35] Dirk Sudholt et al. Towards a Runtime Comparison of Natural and Artificial Evolution, 2015, Algorithmica.
[36] Solving Problems with Unknown Solution Length at (Almost) No Extra Cost, 2015, GECCO.
[37] Benjamin Doerr et al. From black-box complexity to designing new genetic algorithms, 2015, Theor. Comput. Sci..
[38] Carsten Witt et al. Tight Bounds on the Optimization Time of a Randomized Search Heuristic on Linear Functions, 2013, Combinatorics, Probability and Computing.
[39] Dirk Sudholt et al. A New Method for Lower Bounds on the Running Time of Evolutionary Algorithms, 2011, IEEE Transactions on Evolutionary Computation.
[40] Frank Neumann et al. Bioinspired computation in combinatorial optimization: algorithms and their computational complexity, 2010, GECCO '12.
[41] Thomas Jansen et al. Analyzing Evolutionary Algorithms: The Computer Science Perspective, 2012.
[42] Per Kristian Lehre et al. Negative Drift in Populations, 2010, PPSN.
[43] Benjamin Doerr et al. Crossover can provably be useful in evolutionary computation, 2008, GECCO '08.
[44] Jens Jägersküpper et al. When the Plus Strategy Outperforms the Comma Strategy and When Not, 2007, 2007 IEEE Symposium on Foundations of Computational Intelligence.
[45] Dirk Sudholt et al. Crossover is provably essential for the Ising model on trees, 2005, GECCO '05.
[46] Ingo Wegener et al. The Ising Model on the Ring: Mutation Versus Recombination, 2004, GECCO.
[47] Thomas Jansen et al. The Analysis of Evolutionary Algorithms—A Proof That Crossover Really Can Help, 2002, Algorithmica.
[48] Thomas Jansen et al. On the analysis of the (1+1) evolutionary algorithm, 2002, Theor. Comput. Sci..
[49] Ingo Wegener et al. Theoretical Aspects of Evolutionary Algorithms, 2001, ICALP.