Runtime analysis of the continuous (1+1) evolutionary algorithm based on the average gain model

Runtime analysis of continuous evolutionary algorithms (EAs) is an open problem in the theoretical foundations of evolutionary computation, and far fewer results exist for it than for the runtime analysis of discrete EAs. Taking the (1+1) EA as an example, an average gain model and its calculation method are proposed to provide a theory of runtime analysis as a measure of computational time complexity. The average gain is computed to estimate the expected runtime of two (1+1) EAs, one whose mutation is drawn from the standard normal distribution and one from the uniform distribution, on the Sphere function, a benchmark studied by many researchers. The analysis indicates that the computational time complexity of both (1+1) EAs is of exponential order. Furthermore, given the same error accuracy and initial distance, the EA with uniform-distribution mutation converges faster than the one with standard-normal mutation. Numerical results verify the correctness of the proposed theory and the usefulness of the average gain model.
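The setting described above can be illustrated with a minimal sketch of the two (1+1) EAs on the Sphere function. This is not the paper's analysis itself, only a hypothetical simulation: the mutation scales, dimension, initial point, and error accuracy `eps` are illustrative assumptions, and the per-step gain f(x_t) - f(x_{t+1}) is the quantity whose expectation the average gain model studies.

```python
import random

def sphere(x):
    """Sphere benchmark: f(x) = sum_i x_i^2, global minimum 0 at the origin."""
    return sum(xi * xi for xi in x)

def one_plus_one_ea(mutate, x0, eps=1e-2, max_iters=200_000, seed=0):
    """(1+1) EA: perturb the single parent, keep the offspring iff it is
    no worse, and count iterations until f(x) < eps.  The quantity
    f(x_t) - f(x_{t+1}) is the per-step gain whose expectation the
    average gain model analyses."""
    rng = random.Random(seed)
    x = list(x0)
    for t in range(max_iters):
        if sphere(x) < eps:
            return t
        y = [xi + mutate(rng) for xi in x]
        if sphere(y) <= sphere(x):  # elitist (1+1) selection
            x = y
    return max_iters

# The two mutation operators compared in the abstract (illustrative scales):
normal_mut  = lambda rng: rng.gauss(0.0, 1.0)     # standard normal N(0, 1)
uniform_mut = lambda rng: rng.uniform(-1.0, 1.0)  # uniform on [-1, 1]

if __name__ == "__main__":
    x0 = [5.0, 5.0, 5.0]  # same initial distance to the optimum for both runs
    print("normal mutation :", one_plus_one_ea(normal_mut, x0), "iterations")
    print("uniform mutation:", one_plus_one_ea(uniform_mut, x0), "iterations")
```

Comparing the two iteration counts over many seeds, and over increasing error accuracies, is the kind of numerical experiment the abstract refers to when verifying the theoretical estimates.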