Oxford University Press, 1996, 319 p.
Evolutionary Algorithms (EAs) are a class of direct, probabilistic search and optimization algorithms gleaned from the model of organic evolution. The main representatives of this computational paradigm, Genetic Algorithms (GAs), Evolution Strategies (ESs), and Evolutionary Programming (EP), which were developed independently of each other, are presented in this work as instances of a generalized Evolutionary Algorithm. Based on this generalization and a formal framework for Evolutionary Algorithms, a detailed comparison of these three instances with respect to their operators, working principles, and existing theoretical background is performed. Some new theoretical results concerning recombination in Evolution Strategies, the convergence velocity and selection algorithm of Evolutionary Programming, and convergence properties of Genetic Algorithms are presented.
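The generalized Evolutionary Algorithm into which GAs, ESs, and EP are embedded can be sketched as a loop of recombination, mutation, evaluation, and selection. The following minimal illustration is ours, not the thesis's formal framework; all function names and parameter choices (population size, operator details) are hypothetical, instantiated here for bit strings and a counting-ones objective:

```python
import random

def evolutionary_algorithm(fitness, init, mutate, recombine, select,
                           pop_size=50, generations=100):
    """Skeleton of a generalized EA: create offspring by recombination and
    mutation, then select survivors from parents plus offspring."""
    population = [init() for _ in range(pop_size)]
    for _ in range(generations):
        offspring = [mutate(recombine(random.choice(population),
                                      random.choice(population)))
                     for _ in range(pop_size)]
        population = select(population + offspring, fitness, pop_size)
    return max(population, key=fitness)

# Instantiated for 20-bit strings and the "counting ones" objective:
random.seed(1)
n = 20
best = evolutionary_algorithm(
    fitness=sum,                                           # number of ones
    init=lambda: [random.randint(0, 1) for _ in range(n)],
    mutate=lambda x: [b ^ (random.random() < 1.0 / n) for b in x],
    recombine=lambda a, b: [random.choice(p) for p in zip(a, b)],
    select=lambda pool, f, k: sorted(pool, key=f, reverse=True)[:k],
)
```

GAs, ESs, and EP differ precisely in how the slots `init`, `mutate`, `recombine`, and `select` are filled; the elitist truncation selection used above corresponds to a plus-strategy.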
Besides the algorithmic aspect, the Evolutionary Algorithms are also compared experimentally by running them on several artificial parameter optimization problems (sphere model, step function, a generalized function after Ackley, a function after Fletcher and Powell, and a fractal function). On these problems, concerning both convergence velocity and convergence reliability, an Evolution Strategy outperforms Evolutionary Programming, which in turn is slightly better than a Genetic Algorithm.
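Two of the benchmarks named above are standard test functions and can be stated compactly. The following definitions are common textbook forms given for illustration; the exact parameterizations used in the thesis may differ:

```python
import math

def sphere(x):
    """Sphere model f1: continuous, unimodal, minimum 0 at the origin."""
    return sum(xi * xi for xi in x)

def ackley(x, a=20.0, b=0.2, c=2 * math.pi):
    """Generalized Ackley function: highly multimodal, minimum 0 at the
    origin; a common parameterization uses a=20, b=0.2, c=2*pi."""
    n = len(x)
    s1 = sum(xi * xi for xi in x) / n
    s2 = sum(math.cos(c * xi) for xi in x) / n
    return -a * math.exp(-b * math.sqrt(s1)) - math.exp(s2) + a + math.e
```

The sphere model probes pure convergence velocity, while the many local optima of Ackley's function probe convergence reliability.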
The second part of the thesis puts special emphasis on analyzing the behavior of simple Genetic Algorithms that work by mutation and (extinctive) selection. For such a (μ+, λ)-GA, a general convergence velocity theory is presented on the basis of order statistics and transition probabilities under mutation for the corresponding Markov chain. Closed expressions for these transition probabilities are presented for a particular objective function called "counting ones". Using these transition probabilities, the convergence velocity and optimal mutation rate for a (1+, λ)-GA are calculated numerically. It turns out that the optimal mutation rate depends mainly on 1/l (the reciprocal of the search space dimension l), on the distance to the optimum, and on the selective pressure. As λ increases (i.e., as selective pressure increases), both the optimal mutation rate and the convergence velocity increase.
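For the counting-ones objective, the transition probabilities under bitwise mutation admit a closed form: to move from k ones to j ones among l bits, i of the k ones and j − k + i of the l − k zeros must flip, summed over all feasible i. The sketch below is our rendering of that combinatorial expression (function names are ours); the thesis's own formulas should be consulted for the exact statement:

```python
from math import comb

def transition_prob(l, k, j, p):
    """P(bitwise mutation with rate p turns a length-l bit string having
    k ones into one having j ones): sum over i = number of ones flipped
    to zero, with m = j - k + i zeros flipped to one."""
    total = 0.0
    for i in range(max(0, k - j), min(k, l - j) + 1):
        m = j - k + i
        flips = i + m
        total += comb(k, i) * comb(l - k, m) * p**flips * (1 - p)**(l - flips)
    return total

def improvement_prob(l, k, p):
    """Probability that one mutation strictly increases the number of ones."""
    return sum(transition_prob(l, k, j, p) for j in range(k + 1, l + 1))
```

Scanning `improvement_prob` over p for a string close to the optimum (k near l) shows the improvement probability peaking near p = 1/l, consistent with the dependence on the reciprocal of the dimension described above.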
The informal notion of selective pressure admits a quantification by means of takeover times (following earlier work by Goldberg) for the selection mechanisms proportional selection, linear ranking, tournament selection, (μ,λ)-selection, and (μ+λ)-selection. As clarified by a taxonomy and detailed comparison of selection methods, these five mechanisms represent all principal possibilities to perform selection in an Evolutionary Algorithm.
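A takeover time measures how many generations of selection alone (no variation) it takes for copies of the single best individual to dominate the population. Under an infinite-population model this reduces to iterating a growth equation for the best individual's fraction, in the style of Goldberg and Deb's analysis. The sketch below is an illustration under our own assumptions (the proportional case assumes the best individual is twice as fit as the rest), not the thesis's derivation:

```python
def takeover_time(update, n=100):
    """Generations until the best individual's expected fraction grows from
    1/n to at least 1 - 1/n under growth equation `update` (selection only)."""
    p, t = 1.0 / n, 0
    while p < 1.0 - 1.0 / n:
        p = update(p)
        t += 1
    return t

# Infinite-population growth equations:
binary_tournament = lambda p: 2 * p - p * p      # tournament of size 2
proportional_c2 = lambda p: 2 * p / (1 + p)      # proportional, fitness ratio 2
```

Comparing the two confirms the qualitative ordering: tournament selection takes over in fewer generations than proportional selection, i.e., it exerts stronger selective pressure.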
In addition to the counting ones problem, optimal mutation rates are also analyzed for a (1+1)-GA applied to a simple continuous parameter optimization problem. These investigations clearly demonstrate the superiority of a Gray code over the standard binary code, since the former, unlike the latter, does not by itself introduce local optima at the coding level. Concerning the optimal mutation rate, however, both codes yield highly irregular schedules that prevent one from drawing any conclusion towards a heuristic for optimal mutation rate control.
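The coding-level argument rests on Hamming cliffs: under the standard binary code, adjacent parameter values can differ in many bits (e.g. 7 = 0111 vs. 8 = 1000 differ in all four), so reaching a neighboring point may require an improbable simultaneous multi-bit mutation, whereas the reflected Gray code makes every adjacent pair differ in exactly one bit. A small sketch of this well-known property (function names are ours):

```python
def binary_to_gray(b: int) -> int:
    """Reflected binary (Gray) encoding of a non-negative integer."""
    return b ^ (b >> 1)

def gray_to_binary(g: int) -> int:
    """Inverse of binary_to_gray."""
    b = 0
    while g:
        b ^= g
        g >>= 1
    return b

def hamming(a: int, b: int) -> int:
    """Hamming distance between the binary representations of a and b."""
    return bin(a ^ b).count("1")
```

For example, `hamming(7, 8)` is 4 under the plain binary code, but `hamming(binary_to_gray(7), binary_to_gray(8))` is 1; more generally, every pair of consecutive integers is one bit flip apart under the Gray code.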
The results concerning optimal mutation rates and selective pressure are confirmed by a parallel meta-evolution experiment. The meta-algorithm, a hybrid of a Genetic Algorithm and an Evolution Strategy, evolves generalized Genetic Algorithms for optimal convergence velocity. The results of this experiment confirm that the optimal mutation rate and convergence velocity increase as selective pressure increases. On the other hand, the meta-algorithm demonstrates that a combination of techniques from Evolution Strategies and Genetic Algorithms can be used to develop algorithms for handling nonlinear mixed-integer optimization problems.
I A Comparison of Evolutionary Algorithms.
Organic Evolution and Problem-Solving.
Biological Background.
Life and Information Processing.
Meiotic Heredity.
Mutations.
Molecular Darwinism.
Evolutionary Algorithms and Artificial Intelligence.
Evolutionary Algorithms and Global Optimization.
Some Traditional Methods.
Computational Complexity of Global Optimization.
Early Approaches.
Specific Evolutionary Algorithms.
Evolution Strategies.
Representation and Fitness Evaluation.
Mutation.
Recombination.
Selection.
Other Components.
Conceptual Standard Algorithm.
Theory.
Evolutionary Programming.
Representation and Fitness Evaluation.
Mutation.
Recombination.
Selection.
Other Components.
Conceptual Standard Algorithm.
Theory.
Genetic Algorithms.
Representation and Fitness Evaluation.
Mutation.
Recombination.
Selection.
Other Components.
Conceptual Standard Algorithm.
Theory.
Artificial Landscapes.
Sphere Model.
Step Function.
Ackley's Function.
Function after Fletcher and Powell.
Fractal Function.
An Empirical Comparison.
Convergence Velocity: f1, f2.
Convergence Reliability: f3, f4 and f5.
II Extending Genetic Algorithms.
Selection.
Selection Mechanisms.
Proportional Selection.
Ranking.
Tournament Selection.
(μ+λ)- and (μ,λ)-Selection.
Comparison of Takeover Times.
A Taxonomy of Selection Mechanisms.
Experimental Investigation of Selection.
Clear Improvement of Average Solutions: f1, f2.
Ambiguous Results: f3, f4, f5.
A Note on Scaling.
Mutation.
Simplified Genetic Algorithms.
The Counting Ones Problem.
Reflections on Convergence Velocity.
The Role of the Binary Code.
An Experiment in Meta-Evolution.
Parallel Evolutionary Algorithms.
The Algorithm.
Evolving Convergence Velocity.
Summary and Outlook.
A Data for the Fletcher-Powell Function.
B Data from Selection Experiments.
C Software.
C.1 Overview.
C.2 Usage.
C.2.1 The Graphical User Interface Evos 1.0.
C.2.2 Stand-alone Usage.
C.2.3 Visualization of Runs.
C.3 Data Collection.
D The Multiprocessor Environment.
D.1 The Transputer System.
D.2 The Helios Operating System.
E Mathematical Symbols.