# Genetic Algorithms by Sivanandam and Deepa

Introduction to Genetic Algorithms
S. N. Sivanandam · S. N. Deepa
With 193 Figures and 13 Tables

Authors:
S. N. Sivanandam, Professor and Head, Dept. of Computer Science and Engineering, PSG College of Technology, Coimbatore – 641 004, TN, India
S. N. Deepa, Ph.D. Scholar, Dept. of Computer Science and Engineering, PSG College of Technology, Coimbatore – 641 004, TN, India

Library of Congress Control Number: 2007930221
ISBN 978-3-540-73189-4 Springer Berlin Heidelberg New York

This work is subject to copyright.

All rights are reserved, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilm or in any other way, and storage in data banks. Duplication of this publication or parts thereof is permitted only under the provisions of the German Copyright Law of September 9, 1965, in its current version, and permission for use must always be obtained from Springer. Violations are liable for prosecution under the German Copyright Law. Springer is a part of Springer Science+Business Media (springer.com). © Springer-Verlag Berlin Heidelberg 2008. The use of general descriptive names, registered names, trademarks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use. Typesetting: Integra Software Services Pvt. Ltd., India. Cover design: Erich Kirchner, Heidelberg. Printed on acid-free paper. SPIN: 12053230.

## Preface

The origin of evolutionary algorithms was an attempt to mimic some of the processes taking place in natural evolution.

Although the details of biological evolution are not completely understood (even nowadays), there exist some points supported by strong experimental evidence:

- Evolution is a process operating over chromosomes rather than over organisms. The former are organic tools encoding the structure of a living being; i.e., a creature is "built" by decoding a set of chromosomes.
- Natural selection is the mechanism that relates chromosomes with the efficiency of the entity they represent, thus allowing efficient organisms that are well adapted to the environment to reproduce more often than those which are not.
- The evolutionary process takes place during the reproduction stage. A large number of reproductive mechanisms exist in Nature. The most common ones are mutation (which causes the chromosomes of offspring to differ from those of the parents) and recombination (which combines the chromosomes of the parents to produce the offspring).

Based upon the features above, the three mentioned models of evolutionary computing were developed independently (and almost simultaneously). An Evolutionary Algorithm (EA) is an iterative and stochastic process that operates on a set of individuals (a population).

Each individual represents a potential solution to the problem being solved. This solution is obtained by means of an encoding/decoding mechanism. Initially, the population is randomly generated (perhaps with the help of a construction heuristic). Every individual in the population is assigned, by means of a fitness function, a measure of its goodness with respect to the problem under consideration. This value is the quantitative information the algorithm uses to guide the search. Among the evolutionary techniques, genetic algorithms (GAs) are the most widespread group of methods representing the application of evolutionary tools.

They rely on the use of selection, crossover, and mutation operators. Replacement is usually generational, with new individuals replacing the old population. Intuitively, a GA proceeds by creating successive generations of better and better individuals by applying very simple operations. The search is guided only by the fitness value associated with every individual in the population. This value is used to rank individuals according to their relative suitability for the problem being solved. The fitness function, derived from the problem, is in charge of assigning the fitness value to every individual.

The location of these techniques with respect to other deterministic and non-deterministic procedures is shown in the following tree. The figure below outlines the situation of natural techniques among other well-known search procedures. Combinations of EAs with hill-climbing algorithms are very powerful. Genetic algorithms that make intensive use of such local search mechanisms are termed Memetic Algorithms. Parallel models also increase the extent and quality of the search. The exploration performed by EAs compares quite well against the rest of the search techniques for a similar search effort.

Exploitation is a more difficult goal in EAs, but nowadays many solutions exist for EAs to refine solutions. Genetic algorithms are currently the most prominent and widely used computational models of evolution in artificial-life systems. These decentralized models provide a basis for understanding many other systems and phenomena in the world. Research on GAs in artificial life gives illustrative examples in which the genetic algorithm is used to study how learning and evolution interact, and to model ecosystems, immune systems, cognitive systems, and social systems.

## About the Book

This book is meant for a wide range of readers who wish to learn the basic concepts of Genetic Algorithms. It is also suitable for programmers, researchers and management experts whose work is based on optimization techniques. The basic concepts of Genetic Algorithms are dealt with in detail, with the relevant information and knowledge available for understanding the optimization process. The various operators involved in Genetic Algorithm operation are explained with examples. The advanced operators and the various classifications are discussed in a lucid manner, so that a beginner can understand the concepts with minimal effort.

Specific problems are solved using MATLAB 7.0 and the solutions are given. The MATLAB GA toolbox has also been included for the readers' easy reference, so that they can gain hands-on experience with the various GA functions. Apart from the MATLAB solutions, certain problems are also solved using C and C++ and those solutions are given. The book is designed to give broad, in-depth knowledge of Genetic Algorithms. It can be used as a handbook and guide for students of all engineering disciplines, the management sector, the operational research area, computer applications, and for various professionals who work in the optimization area.

The steps involved in genetic programming and its characteristics, along with its applications, are described here.

- Chapter 7 discusses various genetic algorithm optimization problems, including fuzzy optimization, multi-objective optimization, combinatorial optimization, scheduling problems and so on.
- The implementation of genetic algorithms using MATLAB is discussed in Chap. 8. The toolbox functions and simulated results for specific problems are provided in this chapter.
- Chapter 9 gives the implementation of genetic algorithm concepts using C and C++. The implementation is performed for a few benchmark problems.
- The application of genetic algorithms in various emerging fields, along with case studies, is given in Chapter 10.
- Chapter 11 gives a brief introduction to particle swarm optimization and ant colony optimization.

The Bibliography is given at the end for the ready reference of readers.

The salient features of the book include:

- Detailed explanation of Genetic Algorithm concepts
- Numerous Genetic Algorithm optimization problems
- Study of various types of Genetic Algorithms
- Implementation of optimization problems using C and C++
- Simulated solutions for Genetic Algorithm problems using MATLAB 7.0
- Brief description of the basics of Genetic Programming
- Application case studies on Genetic Algorithms in emerging fields

S. N. Sivanandam completed his B.E. (Electrical and Electronics Engineering) in 1964 from Government College of Technology, Coimbatore, and his M.Sc. (Engineering) in Power Systems in 1966 from PSG College of Technology, Coimbatore. He acquired his PhD in Control Systems in 1982 from Madras University. He received the Best Teacher Award in the year 2001 and the Dhakshina Murthy Award for Teaching Excellence from PSG College of Technology.

He received a citation for best teaching and technical contribution in the year 2002 from Government College of Technology, Coimbatore. He has a total teaching experience (UG and PG) of 41 years. The total number of undergraduate and postgraduate projects guided by him for both Computer Science and Engineering and Electrical and Electronics Engineering is around 600. He is currently working as Professor and Head of the Computer Science and Engineering Department, PSG College of Technology, Coimbatore (since June 2000). He has been identified as an outstanding person in the field of Computer Science and Engineering in the MARQUIS "Who's Who", October 2003 issue, New Providence, New Jersey, USA. He has also been identified as an outstanding person in the field of Computational Science and Engineering in "Who's Who", December 2005 issue, Saxe-Coburg Publications, United Kingdom. He has been placed as a VIP member in the Continental WHO's WHO Registry of National Business Leaders, Inc., 33 West Hawthorne Avenue, Valley Stream, NY 11580, Aug 24, 2006. S. N. Sivanandam has published 12 books.

He has delivered around 150 special lectures on different specializations in Summer/Winter schools and in various engineering colleges. He has guided and co-guided 30 Ph.D. research works, and at present 9 Ph.D. research scholars are working under him. The total number of his technical publications in international/national journals and conferences is around 700. He has also received a Certificate of Merit 2005–2006 for his paper from The Institution of Engineers (India). He has chaired 7 international conferences and 30 national conferences. He is a member of various professional bodies such as IE (India), ISTE, CSI, ACS and SSI.

He is a technical advisor for various reputed industries and engineering institutions. His research areas include Modeling and Simulation, Neural Networks, Fuzzy Systems and Genetic Algorithms, Pattern Recognition, Multidimensional System Analysis, Linear and Nonlinear Control Systems, Signal and Image Processing, Control Systems, Power Systems, Numerical Methods, Parallel Computing, Data Mining and Database Security.

S. N. Deepa completed her B.E. Degree at Government College of Technology, Coimbatore, in 1999 and her M.E. Degree at PSG College of Technology, Coimbatore, in 2004. She was a gold medalist in her B.E. Degree programme. She received the G.D. Memorial Award in the year 1997 and the Best Outgoing Student Award from PSG College of Technology, 2004. Her M.E. thesis won a National Award from the Indian Society of Technical Education and L&T, 2004. She has published 5 books and papers in international and national journals. Her research areas include Neural Networks, Fuzzy Logic, Genetic Algorithms, Digital Control, and Adaptive and Nonlinear Control.

## Acknowledgement

The authors are always thankful to the Almighty for perseverance and achievements. They wish to thank Shri G. Rangaswamy, Managing Trustee, PSG Institutions; Shri C. R. Swaminathan, Chief Executive; and Dr.

R. Rudramoorthy, Principal, PSG College of Technology, Coimbatore, for their whole-hearted cooperation and great encouragement given in this successful endeavor. They also wish to thank the staff members of computer science and engineering for their cooperation. Deepa wishes to thank her husband Anand, daughter Nivethitha and parents for their support. xi Contents 1 Evolutionary Computation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1. 1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1. 2 The Historical Development of EC . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1. 2. 1 Genetic Algorithms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1. 2. 2 Genetic Programming . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1. 2. 3 Evolutionary Strategies . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1. 2. 4 Evolutionary Programming . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1. 3 Features of Evolutionary Computation . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1. 3. 1 Particulate Genes and Population Genetics . . . . . . . . . . . . . . . . . . 1. . 2 The Adaptive Code Book . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1. 3. 3 The Genotype/Phenotype Dichotomy . . . . . . . . . . . . . . . . . . . . . . . 1. 4 Advantages of Evolutionary Computation . . . . . . . . . . . . . . . . . . . . . . . . . 1. 4. 1 Conceptual Simplicity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1. 4. 2 Broad Applicability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1. 4. 3 Hybridization with Other Methods . . . . . . . . . . . . . . . . . . . . . . . . . 1. 4. 4 Parallelism . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1. 4. 5 Robust to Dynamic Changes . . . . . . . . . 
. . . . . . . . . . . . . . . . . . . . . 1. 4. 6 Solves Problems that have no Solutions . . . . . . . . . . . . . . . . . . . . . 1. 5 Applications of Evolutionary Computation . . . . . . . . . . . . . . . . . . . . . . . . . 1. 6 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2 Genetic Algorithms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2. 1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2. 2 Biological Background . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2. 2. 1 The Cell . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2. 2. 2 Chromosomes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2. 2. 3 Genetics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2. 2. 4 Reproduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2. 2. 5 Natural Selection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2. 3 What is Genetic Algorithm? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2. 3. 1 Search Space . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2. 3. 2 Genetic Algorithms World . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2. 3. 3 Evolution and Optimization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2. 3. 4 Evolution and Genetic Algorithms . . . . . . . . . . . . . . . . . . . . . . . . . 1 1 2 2 3 4 5 5 6 7 8 9 10 10 11 11 11 12 12 13 15 15 16 16 16 17 17 19 20 20 20 22 23 xiv

Contents 2. 4 2. 5 2. 6 2. 7 2. 8 2. 9 Conventional Optimization and Search Techniques . . . . . . . . . . . . . . . . . . 2. 4. 1 Gradient-Based Local Optimization Method . . . . . . . . . . . . . . . . . 2. 4. 2 Random Search . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2. 4. 3 Stochastic Hill Climbing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2. 4. 4 Simulated Annealing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2. 4. 5 Symbolic Arti? cial Intelligence (AI) . . . . . . . . . . . . . . . . . . . . . . . A Simple Genetic Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . Comparison of Genetic Algorithm with Other Optimization Techniques . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . Advantages and Limitations of Genetic Algorithm . . . . . . . . . . . . . . . . . . Applications of Genetic Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24 25 26 27 27 29 29 33 34 35 36 39 39 39 39 40 41 41 42 43 43 43 44 44 44 45 45 46 46 50 56 57 59 59 60 60 60 60 61 62 62 63 65 Terminologies and Operators of GA . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. 1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. 2 Key Elements . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. 3 Individuals . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. 4 Genes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. 5 Fitness . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. 6 Populations . . . . . . . . . . . . . 
. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. 7 Data Structures . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. 8 Search Strategies . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. 9 Encoding . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. 9. 1 Binary Encoding . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. . 2 Octal Encoding . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. 9. 3 Hexadecimal Encoding . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. 9. 4 Permutation Encoding (Real Number Coding) . . . . . . . . . . . . . . . 3. 9. 5 Value Encoding . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. 9. 6 Tree Encoding . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. 10 Breeding . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. 10. 1 Selection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. 10. 2 Crossover (Recombination) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. 10. 3 Mutation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. 10. 4 Replacement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. 11 Search Termination (Convergence Criteria) . . . . . . . . . . . . . . . . . . . . . . . . 3. 11. 1 Best Individual . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. 11. 2 Worst individual . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. 11. 3 Sum of Fitness . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. 11. 
4 Median Fitness . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. 12 Why do Genetic Algorithms Work? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. 12. 1 Building Block Hypothesis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. 12. 2 A Macro-Mutation Hypothesis . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. 12. 3 An Adaptive Mutation Hypothesis . . . . . . . . . . . . . . . . . . . . . . . . 3. 12. 4 The Schema Theorem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. 12. 5 Optimal Allocation of Trials . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . Contents xv 3. 13 3. 14 3. 15 3. 16 3. 17 3. 18 3. 12. 6 Implicit Parallelism . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. 12. 7 The No Free Lunch Theorem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . Solution Evaluation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . Search Re? nement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . Constraints . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . Fitness Scaling . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. 16. 1 Linear Scaling . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. 16. 2 Sigma Truncation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. 16. 3 Power Law Scaling . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . Example Problems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. 17. 1 Maximizing a Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3. 17. 2 Traveling Salesman Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . Summary . . . . . . 
. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . Exercise Problems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 66 68 68 69 69 70 70 71 72 72 72 76 78 81 4 Advanced Operators and Techniques in Genetic Algorithm . . . . . . . . . 83 4. 1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 83 4. 2 Diploidy, Dominance and Abeyance . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 83 4. 3 Multiploid . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 85 4. 4 Inversion and Reordering . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 86 4. 4. 1 Partially Matched Crossover (PMX) . . . . . . . . . . . . . . . . . . . . . . . . 88 4. 4. 2 Order Crossover (OX) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88 4. 4. 3 Cycle Crossover (CX) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89 4. 5 Niche and Speciation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89 4. 5. 1 Niche and Speciation in Multimodal Problems . . . . . . . . . . . . . . . 90 4. 5. 2 Niche and Speciation in Unimodal Problems . . . . . . . . . . . . . . . . . 93 4. 5. 3 Restricted Mating . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 96 4. 6 Few Micro-operators . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 97 4. 6. 1 Segregation and Translocation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 97 4. 6. 2 Duplication and Deletion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 97 4. 6. 3 Sexual Determination . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 98 4. 7 Non-binary Representation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 98 4. 
8 Multi-Objective Optimization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 99 4. 9 Combinatorial Optimizations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 100 4. 10 Knowledge Based Techniques . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 100 4. 11 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 102 Exercise Problems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 103 5 Classi? cation of Genetic Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105 5. 1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105 5. 2 Simple Genetic Algorithm (SGA) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105 5. 3 Parallel and Distributed Genetic Algorithm (PGA and DGA) . . . . . . . . . 106 5. 3. 1 Master-Slave Parallelization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 109 5. 3. 2 Fine Grained Parallel GAs (Cellular GAs) . . . . . . . . . . . . . . . . . . . 110 5. 3. 3 Multiple-Deme Parallel GAs (Distributed GAs or Coarse Grained GAs) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 111 xvi Contents 5. 4 5. 5 5. 6 5. 7 5. 8 5. 3. 4 Hierarchical Parallel Algorithms . . . . . . . . . . . . . . . . . . . . . . . . . . 113 Hybrid Genetic Algorithm (HGA) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 115 5. 4. 1 Crossover . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 116 5. 4. 2 Initialization Heuristics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 117 5. 4. 3 The RemoveSharp Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 117 5. 4. 4 The LocalOpt Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 
119 Adaptive Genetic Algorithm (AGA) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 119 5. . 1 Initialization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 120 5. 5. 2 Evaluation Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 120 5. 5. 3 Selection operator . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 121 5. 5. 4 Crossover operator . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 121 5. 5. 5 Mutation operator . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 122 Fast Messy Genetic Algorithm (FmGA) . . . . . . . . . . . . . . . . . . . . . . . . . . 122 5. 6. 1 Competitive Template (CT) Generation . . . . . . . . . . . . . . . . . . . . . 123 Independent Sampling Genetic Algorithm (ISGA) . . . . . . . . . . . . . . . . . . 124 5. 7. 1 Independent Sampling Phase . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 125 5. 7. 2 Breeding Phase . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 126 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 127 Exercise Problems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 129 Genetic Programming . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 131 6. 1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 131 6. 2 Comparison of GP with Other Approaches . . . . . . . . . . . . . . . . . . . . . . . . . 131 6. 3 Primitives of Genetic Programming . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 135 6. 3. 1 Genetic Operators . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 136 6. 3. 2 Generational Genetic Programming . . . . . . . . . . . . . . . . . . . . . . . 136 6. 3. 
3 Tree Based Genetic Programming . . . . . . . . . . . . . . . . . . . . . . . . . . 136 6. 3. 4 Representation of Genetic Programming . . . . . . . . . . . . . . . . . . . . 137 6. 4 Attributes in Genetic Programming . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 141 6. 5 Steps of Genetic Programming . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 143 6. 5. 1 Preparatory Steps of Genetic Programming . . . . . . . . . . . . . . . . . . 143 6. 5. 2 Executional Steps of Genetic Programming . . . . . . . . . . . . . . . . . . 146 6. 6 Characteristics of Genetic Programming . . . . . . . . . . . . . . . . . . . . . . . . . . 149 6. 6. 1 What We Mean by “Human-Competitive” . . . . . . . . . . . . . . . . . . . 149 6. 6. 2 What We Mean by “High-Return” . . . . . . . . . . . . . . . . . . . . . . . . . 152 6. 6. 3 What We Mean by “Routine” . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 154 6. 6. 4 What We Mean by “Machine Intelligence” . . . . . . . . . . . . . . . . . . 154 6. 7 Applications of Genetic Programming . . . . . . . . . . . . . . . . . . . . . . . . . . . . 156 6. 7. 1 Applications of Genetic Programming in Civil Engineering . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 156 6. 8 Haploid Genetic Programming with Dominance . . . . . . . . . . . . . . . . . . . . 159 6. 8. 1 Single-Node Dominance Crossover . . . . . . . . . . . . . . . . . . . . . . . . 161 6. 8. 2 Sub-Tree Dominance Crossover . . . . . . . . . . . . . . . . . . . . . . . . . . . 161 6. 9 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 161 Exercise Problems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 163 Contents xvii 7 Genetic Algorithm Optimization Problems . . . . . . . . . . . . . . . . . . . . . . . . 165 7. 1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 
. . . . . . . 165 7. 2 Fuzzy Optimization Problems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 165 7. 2. 1 Fuzzy Multiobjective Optimization . . . . . . . . . . . . . . . . . . . . . . . . . 166 7. 2. 2 Interactive Fuzzy Optimization Method . . . . . . . . . . . . . . . . . . . . . 168 7. 2. 3 Genetic Fuzzy Systems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 168 7. 3 Multiobjective Reliability Design Problem . . . . . . . . . . . . . . . . . . . . . . . . 170 7. 3. 1 Network Reliability Design . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 170 7. 3. 2 Bicriteria Reliability Design . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 174 7. 4 Combinatorial Optimization Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 176 7. 4. 1 Linear Integer Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 178 7. 4. 2 Applications of Combinatorial Optimization . . . . . . . . . . . . . . . . . 179 7. 4. 3 Methods . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 182 7. 5 Scheduling Problems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 187 7. 5. 1 Genetic Algorithm for Job Shop Scheduling Problems (JSSP) . . 187 7. 6 Transportation Problems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 190 7. 6. 1 Genetic Algorithm in Solving Transportation Location-Allocation Problems with Euclidean Distances . . . . . . . 191 7. 6. 2 Real-Coded Genetic Algorithm (RCGA) for Integer Linear Programming in Production-Transportation Problems with Flexible Transportation Cost . . . . . . . . . . . . . . . . . . . . . . . . . 194 7. 7 Network Design and Routing Problems . . . . . . . . . . . . . . . . . . . . . . . . . . . 199 7. 7. 1 Planning of Passive Optical Networks . . . . . . . . . . . . . . . . . . . . . . 199 7. 7. 2 Planning of Packet Switched Networks . . . . . 
. . . . . . . . . . . . . . . . 202 7. 7. 3 Optimal Topological Design of All Terminal Networks . . . . . . . . 203 7. 8 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 208 Exercise Problems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 09 8 Genetic Algorithm Implementation Using Matlab . . . . . . . . . . . . . . . . . . . 211 8. 1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 211 8. 2 Data Structures . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 211 8. 2. 1 Chromosomes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 212 8. 2. 2 Phenotypes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 212 8. 2. 3 Objective Function Values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 213 8. 2. 4 Fitness Values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 213 8. 2. 5 Multiple Subpopulations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 213 8. 3 Toolbox Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 214 8. 4 Genetic Algorithm Graphical User Interface Toolbox . . . . . . . . . . . . . . . . 219 8. 5 Solved Problems using MATLAB . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 224 8. 6 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 260 Review Questions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 261 Exercise Problems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 261 9 Genetic Algorithm Optimization in C/C++ . . . . . . . . . . . . . . . . . . . . . . . . . 263 9. 1 Introduction . . . . . . . 
. . . 263
9.2 Traveling Salesman Problem (TSP) . . . 263
9.3 Word Matching Problem . . . 271
9.4 Prisoner's Dilemma . . . 280
9.5 Maximize f(x) = x² . . . 286
9.6 Minimization of a Sine Function with Constraints . . . 292
9.6.1 Problem Description . . . 293
9.7 Maximizing the Function f(x) = x·sin(10π·x) + 10 . . . 302
9.8 Quadratic Equation Solving . . . 310
9.9 Summary . . . 315
9.9.1 Projects . . . 315
10 Applications of Genetic Algorithms . . . 317
10.1 Introduction . . . 317
10.2 Mechanical Sector . . . 317
10.2.1 Optimizing Cyclic-Steam Oil Production with Genetic Algorithms . . . 317
10.2.2 Genetic Programming and Genetic Algorithms for Auto-tuning Mobile Robot Motion Control . . . 320
10.3 Electrical Engineering . . . 324
10.3.1 Genetic Algorithms in Network Synthesis
. . . 324
10.3.2 Genetic Algorithm Tools for Control Systems Engineering . . . 328
10.3.3 Genetic Algorithm Based Fuzzy Controller for Speed Control of Brushless DC Motor . . . 334
10.4 Machine Learning . . . 341
10.4.1 Feature Selection in Machine Learning using GA . . . 341
10.5 Civil Engineering . . . 345
10.5.1 Genetic Algorithm as Automatic Structural Design Tool . . . 345
10.5.2 Genetic Algorithm for Solving Site Layout Problem . . . 350
10.6 Image Processing . . . 352
10.6.1 Designing Texture Filters with Genetic Algorithms . . . 352
10.6.2 Genetic Algorithm Based Knowledge Acquisition on Image Processing . . . 357
10.6.3 Object Localization in Images Using Genetic Algorithm . . . 362
10.6.4 Problem Description . . . 363
10.6.5 Image Preprocessing . . . 364
10.6.6 The Proposed Genetic Algorithm Approach . . . 365
10.7 Data Mining . . . 367
10.7.1 A Genetic Algorithm for Feature Selection in Data-Mining . . . 367
10.7.2 Genetic Algorithm Based Fuzzy Data Mining to Intrusion Detection . . . 370
10.7.3 Selection and Partitioning of Attributes in Large-Scale Data Mining Problems Using Genetic Algorithm . . . 379
10.8 Wireless Networks . . . 386
10.8.1 Genetic Algorithms for Topology Planning in Wireless Networks . . . 386
10.8.2 Genetic Algorithm for Wireless ATM Network . . . 387
10.9 Very Large Scale Integration (VLSI) . . . 395
10.9.1 Development of a Genetic Algorithm Technique for VLSI Testing . . . 395
10.9.2 VLSI Macro Cell Layout Using Hybrid GA . . . 397
10.9.3 Problem Description . . . 398
10.9.4 Genetic Layout Optimization . . . 399
10.10 Summary . . . 402
11 Introduction to Particle Swarm Optimization and Ant Colony Optimization . . . 403
11.1 Introduction . . . 403
11.2 Particle Swarm Optimization . . . 403
11.2.1 Background of Particle Swarm Optimization . . . 404
11.2.2 Operation of Particle Swarm Optimization . . . 405
11.2.3 Basic Flow of Particle Swarm Optimization . . . 407
11.2.4 Comparison Between PSO and GA . . . 408
11.2.5 Applications of PSO . . . 410
11.3 Ant Colony Optimization
. . . 410
11.3.1 Biological Inspiration . . . 410
11.3.2 Similarities and Differences Between Real Ants and Artificial Ants . . . 414
11.3.3 Characteristics of Ant Colony Optimization . . . 415
11.3.4 Ant Colony Optimization Algorithms . . . 416
11.3.5 Applications of Ant Colony Optimization . . . 422
11.4 Summary . . . 424
Exercise Problems . . . 424
Bibliography . . . 425

Chapter 1 Evolutionary Computation

1.1 Introduction

Darwinian evolution, set out by Charles Darwin in 1859, is intrinsically a robust search and optimization mechanism. Darwin's principle "survival of the fittest" captured the popular imagination.

This principle can be used as a starting point in introducing evolutionary computation. Evolved biota demonstrate optimized complex behavior at each level: the cell, the organ, the individual and the population. Biological species have solved the problems of chaos, chance, nonlinear interactivity and temporality, problems that have proved intractable for the classic methods of optimization. The evolutionary concept can be applied to problems where heuristic solutions are not available or lead to unsatisfactory results. As a result, evolutionary algorithms are of recent interest, particularly for practical problem solving.

The theory of natural selection proposes that the plants and animals that exist today are the result of millions of years of adaptation to the demands of the environment. At any given time, a number of different organisms may co-exist and compete for the same resources in an ecosystem. The organisms that are most capable of acquiring resources and successfully procreating are the ones whose descendants will tend to be numerous in the future. Organisms that are less capable, for whatever reason, will tend to have few or no descendants in the future. The former are said to be more fit than the latter, and the distinguishing characteristics that caused the former to be fit are said to be selected for over the characteristics of the latter. Over time, the entire population of the ecosystem is said to evolve to contain organisms that, on average, are more fit than those of previous generations of the population because they exhibit more of those characteristics that tend to promote survival. Evolutionary computation (EC) techniques abstract these evolutionary principles into algorithms that may be used to search for optimal solutions to a problem.

In a search algorithm, a number of possible solutions to a problem are available and the task is to find the best solution possible in a fixed amount of time. For a search space with only a small number of possible solutions, all the solutions can be examined in a reasonable amount of time and the optimal one found. This exhaustive search, however, quickly becomes impractical as the search space grows in size. Traditional search algorithms randomly sample (e.g., random walk) or heuristically sample (e.g., gradient descent) the search space one solution at a time in the hope of finding the optimal solution. The key aspect distinguishing an evolutionary search algorithm from such traditional algorithms is that it is population-based. Through the adaptation of successive generations of a large number of individuals, an evolutionary algorithm performs an efficient directed search. Evolutionary search is generally better than random search and is not susceptible to the hill-climbing behaviors of gradient-based search. Evolutionary computing began by lifting ideas from biological evolutionary theory into computer science, and continues to look toward new biological research findings for inspiration.
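The impracticality of exhaustive search is easy to make concrete. A minimal sketch (the function name and the bit lengths are illustrative assumptions, not from the text): for an n-bit representation the search space contains 2^n candidate solutions, so enumeration stops being feasible long before n reaches chromosome lengths used in practice.

```python
def search_space_size(n_bits: int) -> int:
    """Number of candidate solutions for an n-bit representation."""
    return 2 ** n_bits

# A 10-bit problem can be enumerated; a 50-bit problem cannot.
for n in (10, 20, 50):
    print(n, search_space_size(n))
```

Even at one evaluation per microsecond, a 50-bit space (about 10^15 candidates) would take decades to enumerate, which is why sampling-based and population-based searches are used instead.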

However, an over-enthusiastic "biology envy" can only be to the detriment of both disciplines by masking the broader potential for two-way intellectual traffic of shared insights and analogizing from one another. Three fundamental features of biological evolution illustrate the range of potential intellectual flow between the two communities: particulate genes carry some subtle consequences for biological evolution that have not yet been translated into mainstream EC; the adaptive properties of the genetic code illustrate how both communities can contribute to a common understanding of appropriate evolutionary abstractions; finally, EC exploration of representational language seems pre-adapted to help biologists understand why life evolved a dichotomy of genotype and phenotype.

1.2 The Historical Development of EC

In the case of evolutionary computation, there are four historical paradigms that have served as the basis for much of the activity of the field: genetic algorithms (Holland, 1975), genetic programming (Koza, 1992, 1994), evolutionary strategies (Rechenberg, 1973), and evolutionary programming (Fogel et al., 1966).

The basic differences between the paradigms lie in the nature of the representation schemes, the reproduction operators and the selection methods.

1.2.1 Genetic Algorithms

The most popular technique in evolutionary computation research has been the genetic algorithm. In the traditional genetic algorithm, the representation used is a fixed-length bit string. Each position in the string is assumed to represent a particular feature of an individual, and the value stored in that position represents how that feature is expressed in the solution.

Usually, the string is "evaluated as a collection of structural features of a solution that have little or no interactions". The analogy may be drawn directly to genes in biological organisms. Each gene represents an entity that is structurally independent of other genes. The main reproduction operator used is bit-string crossover, in which two strings are used as parents and new individuals are formed by swapping a sub-sequence between the two strings (see Fig. 1.1). Another popular operator is bit-flipping mutation, in which a single bit in the string is flipped to form a new offspring string (see Fig. 1.2). A variety of other operators have also been developed, but are used less frequently (e.g., inversion, in which a subsequence in the bit string is reversed). A primary distinction that may be made between the various operators is whether or not they introduce any new information into the population. Crossover, for example, does not, while mutation does. All operators are also constrained to manipulate the string in a manner consistent with the structural interpretation of genes.

Fig. 1.1 Bit-string crossover of parents a & b to form offspring c & d
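The two operators just described can be sketched in a few lines. This is an illustrative sketch, not the book's implementation; the crossover point and mutation position are passed in explicitly here to keep the behavior deterministic, whereas in practice they are drawn at random.

```python
def crossover(a: str, b: str, point: int):
    """One-point bit-string crossover: swap the tails of two parent strings."""
    c = a[:point] + b[point:]
    d = b[:point] + a[point:]
    return c, d

def mutate(s: str, pos: int) -> str:
    """Bit-flipping mutation: flip a single bit to form a new offspring."""
    flipped = '1' if s[pos] == '0' else '0'
    return s[:pos] + flipped + s[pos + 1:]

# Crossover only rearranges existing bits; mutation introduces new information.
c, d = crossover('11111', '00000', 2)   # c = '11000', d = '00111'
m = mutate('11111', 0)                  # m = '01111'
```

Note that, as the text observes, `crossover` cannot produce a bit value absent from both parents, while `mutate` can.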

For example, two genes at the same location on two strings may be swapped between parents, but not combined based on their values. Traditionally, individuals are selected to be parents probabilistically based upon their fitness values, and the offspring that are created replace the parents. For example, if N parents are selected, then N offspring are generated which replace the parents in the next generation.

1.2.2 Genetic Programming

An increasingly popular technique is that of genetic programming. In a standard genetic program, the representation used is a variable-sized tree of functions and values.

Each leaf in the tree is a label from an available set of value labels. Each internal node in the tree is a label from an available set of function labels. The entire tree corresponds to a single function that may be evaluated. Typically, the tree is evaluated in a leftmost depth-first manner. A leaf is evaluated as the corresponding value. A function is evaluated using arguments that are the results of the evaluation of its children. Genetic algorithms and genetic programming are similar in most other respects, except that the reproduction operators are tailored to a tree representation.

The most commonly used operator is subtree crossover, in which an entire subtree is swapped between two parents (see Fig. 1.3). In a standard genetic program, all values and functions are assumed to return the same type, although functions may vary in the number of arguments they take. This closure principle (Koza, 1994) allows any subtree to be considered structurally on par with any other subtree, and ensures that operators such as subtree crossover will always produce legal offspring.

Fig. 1.2 Bit-flipping mutation of parent a to form offspring b
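The tree representation and its leftmost depth-first evaluation can be sketched as follows. The nested-tuple encoding and the small function set are assumptions made for illustration; they are not the book's code. All functions return the same type (here, numbers), respecting the closure principle.

```python
# A GP individual as nested tuples: internal nodes carry a function label,
# leaves carry values.
FUNCS = {'+': lambda x, y: x + y, '*': lambda x, y: x * y}

def evaluate(node):
    """Evaluate a tree in leftmost depth-first order."""
    if isinstance(node, tuple):                   # internal node: function label
        label, *children = node
        args = [evaluate(c) for c in children]    # children, left to right
        return FUNCS[label](*args)
    return node                                   # leaf: the corresponding value

tree = ('+', ('*', 2, 3), 4)   # the expression 2*3 + 4
print(evaluate(tree))          # 10
```

Because every subtree evaluates to the same type, swapping any subtree of one such tree into another (subtree crossover) always yields another legal, evaluable tree.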

Fig. 1.3 Subtree crossover of parents a & b to form offspring c & d

1.2.3 Evolutionary Strategies

In evolutionary strategies, the representation used is a fixed-length real-valued vector. As with the bit-strings of genetic algorithms, each position in the vector corresponds to a feature of the individual. However, the features are considered to be behavioral rather than structural. "Consequently, arbitrary non-linear interactions between features during evaluation are expected which forces a more holistic approach to evolving solutions" (Angeline, 1996).

The main reproduction operator in evolutionary strategies is Gaussian mutation, in which a random value from a Gaussian distribution is added to each element of an individual's vector to create a new offspring (see Fig. 1.4). Another operator that is used is intermediate recombination, in which the vectors of two parents are averaged together, element by element, to form a new offspring (see Fig. 1.5). The effects of these operators reflect the behavioral, as opposed to structural, interpretation of the representation, since knowledge of the values of vector elements is used to derive new vector elements.
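These two vector operators can be sketched directly. The step size `sigma` and the function names are illustrative assumptions; real ES implementations usually also evolve the step sizes themselves.

```python
import random

def gaussian_mutation(parent, sigma=0.1, rng=random):
    """Add a Gaussian perturbation to each element of a real-valued vector."""
    return [x + rng.gauss(0.0, sigma) for x in parent]

def intermediate_recombination(a, b):
    """Average two parent vectors element by element to form one offspring."""
    return [(x + y) / 2.0 for x, y in zip(a, b)]

child = intermediate_recombination([1.0, 2.0, 3.0], [3.0, 4.0, 5.0])
# child == [2.0, 3.0, 4.0]
```

Both operators compute new element values from the parents' element values, which is exactly the behavioral (value-based) manipulation the text contrasts with the structural swapping used in genetic algorithms.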

The selection of parents to form offspring is less constrained than it is in genetic algorithms and genetic programming. For instance, due to the nature of the representation, it is easy to average vectors from many individuals to form a single offspring. In a typical evolutionary strategy, N parents are selected uniformly randomly (i.e., not based upon fitness), more than N offspring are generated through the use of recombination, and then N survivors are selected deterministically. The survivors are chosen either from the best N offspring (i.e., no parents survive) or from the best N parents and offspring.

Fig. 1.4 Gaussian mutation of parent a to form offspring b

Fig. 1.5 Intermediate recombination of parents a & b to form offspring c

1.2.4 Evolutionary Programming

Evolutionary programming took the idea of representing individuals phenotypically as finite state machines capable of responding to environmental stimuli, and developed operators for effecting structural and behavioral change over time.

This idea was applied to a wide range of problems including prediction problems, optimization and machine learning. The above characterizations lead one to the following observations. GA practitioners are seldom constrained to fixed-length binary implementations. GP enables the use of variable-sized trees of functions and values. ES practitioners have incorporated recombination operators into their systems. EP is used for the evolution of finite state machines. The representations used in evolutionary programming are typically tailored to the problem domain. One representation commonly used is a fixed-length real-valued vector. The primary difference between evolutionary programming and the previous approaches is that no exchange of material between individuals in the population is made. Thus, only mutation operators are used. For real-valued vector representations, evolutionary programming is very similar to evolutionary strategies without recombination. A typical selection method is to select all the individuals in the population to be the N parents, to mutate each parent to form N offspring, and to probabilistically select, based upon fitness, N survivors from the total 2N individuals to form the next generation.

1.3 Features of Evolutionary Computation

In an evolutionary algorithm, a representation scheme is chosen by the researcher to define the set of solutions that form the search space for the algorithm. A number of individual solutions are created to form an initial population. The following steps are then repeated iteratively until a solution has been found which satisfies a pre-defined termination criterion. Each individual is evaluated using a fitness function that is specific to the problem being solved. Based upon their fitness values, a number of individuals are chosen to be parents. New individuals, or offspring, are produced from those parents using reproduction operators.
The fitness values of those offspring are determined. Finally, survivors are selected from the old population and the offspring to form the new population of the next generation. The mechanisms determining which and how many parents to select, how many offspring to create, and which individuals will survive into the next generation together represent a selection method. Many different selection methods have been proposed in the literature, and they vary in complexity.
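The generational loop just described can be sketched end to end on a toy task: maximizing f(x) = x² over 5-bit strings (the same toy problem revisited in Chapter 9). This is an illustrative sketch only: it uses mutation as the sole reproduction operator and elitist truncation as the survivor-selection step, which is one simple choice among the many selection methods mentioned above, not the book's implementation.

```python
import random

def fitness(bits: str) -> int:
    """Decode a bit string to an integer x and return f(x) = x**2."""
    return int(bits, 2) ** 2

def evolve(pop_size=6, n_bits=5, generations=30, seed=1):
    rng = random.Random(seed)
    # Initial population of random bit strings.
    pop = [''.join(rng.choice('01') for _ in range(n_bits))
           for _ in range(pop_size)]
    for _ in range(generations):
        # Reproduction: each parent yields one offspring by a single bit flip.
        offspring = []
        for parent in pop:
            i = rng.randrange(n_bits)
            child = parent[:i] + ('1' if parent[i] == '0' else '0') + parent[i + 1:]
            offspring.append(child)
        # Survivor selection: keep the best pop_size of parents plus offspring,
        # so each generation's population is the same size.
        pop = sorted(pop + offspring, key=fitness, reverse=True)[:pop_size]
    return max(pop, key=fitness)

best = evolve()
# With these settings the optimum '11111' (x = 31, f = 961) is typically reached.
```

Swapping in a different reproduction operator (e.g., the crossover of Section 1.2.1) or a probabilistic survivor selection changes only the two marked steps, which is why the loop is described operator by operator in the text.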

Typically, though, most selection methods ensure that the population of each generation is the same size. EC techniques continue to grow in complexity and desirability, as biological research continues to change our perception of the evolutionary process. In this context, we introduce three fundamental features of biological evolution:

1. particulate genes and population genetics
2. the adaptive genetic code
3. the dichotomy of genotype and phenotype

Each phenomenon is chosen to represent a different point in the spectrum of possible relationships between computing and biological evolutionary theory.

The first is chosen to ask whether current EC has fully transferred the basics of biological evolution. The second demonstrates how both biological and computational evolutionary theorists can contribute to a common understanding of evolutionary abstractions. The third is chosen to illustrate a question of biological evolution that EC seems better suited to tackle than biology.

1.3.1 Particulate Genes and Population Genetics

Mainstream thinking of the time viewed the genetic essence of phenotype as a liquid that blended whenever male and female met to reproduce. It took the world's first professor of engineering, Fleeming Jenkin (1867), to point out the mathematical consequence of blending inheritance: a novel advantageous mutation arising in a sexually reproducing organism would dilute itself out of existence during the early stages of its spread through any population comprising more than a few individuals. This is a simple consequence of biparental inheritance. Mendel's theory of particulate genes (Mendel, 1866) replaced this flawed, analogue concept of blending inheritance with a digital system in which the advantageous version (allele) of a gene is either present or absent, and biparental inheritance produces diploidy.

Thus natural selection merely alters the proportions of alleles in a population, and an advantageous mutation can be selected into fixation (presence within 100% of individuals) without any loss in its fitness. Though much has been written about the Neo-Darwinian Synthesis that ensued from combining Mendelian genetics with Darwinian theory, it largely amounts to biologists' gradual acceptance that the particulate nature of genes alone provided a solid foundation on which to build detailed, quantitative predictions about evolution.

Indeed, the dismissal of mathematical models of genes in populations as "bean bag genetics" overlooks the scope of logical deductions that follow from particulate genetics. They extend far beyond testable explanations for adaptive phenomena and into deeper, abstract concepts of biological evolution. For example, particulate genes introduce stochasticity into evolution. Because genes are either present or absent from any given genome, the genetic makeup of each new individual in a sexually reproducing population is a probabilistic outcome of which particular alleles it inherits from each parent.

Unless offspring are infinite in number, their allele frequencies will not accurately mirror those of the parental generation, but instead will show some sampling error (genetic drift). The magnitude of this sampling error is inversely proportional to the size of a population. Wright (1932) noted that because real populations fluctuate in size, temporary reductions can briefly relax selection, potentially allowing gene pools to diffuse far enough away from local optima to find new destinations when population size recovers and selection reasserts itself. In effect, particulate genes in finite populations improve the evolutionary heuristic from a simple hill-climbing algorithm to something closer to simulated annealing under a fluctuating temperature. One final property of particulate genes operating in sexual populations is worthy of mention. In the large populations where natural selection works most effectively, any novel advantageous mutation that arises will only reach fixation over the course of multiple generations. During this spread, recombination and diploidy together ensure that the allele will temporarily find itself in many different genetic contexts. Classical population genetics (e.g., Fisher, 1930) and experimental EC systems (e.g., O'Reilly, 1999) have focused on whether and how this context promotes selective pressure for gene linkage into "co-adapted gene complexes". A simpler observation is that a novel, advantageous allele's potential for negative epistatic effects is integral to its micro-evolutionary success. Probability will favor the fixation of alleles that are good "team players" (i.e., ones that reliably imbue their advantage regardless of genetic background). Many mainstream EC methods simplify the population genetics of new mutations (e.g., into tournaments) to expedite the adaptive process.
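The inverse relationship between population size and sampling error can be seen in a few lines of simulation. This is a hypothetical one-locus sketch, not from the text: each generation, 2N allele copies (a diploid population of N individuals) are drawn at random from the parental allele frequency, and we measure how far the offspring frequency drifts in a single generation.

```python
import random

def drift_step(freq, n_individuals, rng):
    """One generation of genetic drift: sample 2N allele copies at frequency freq."""
    copies = 2 * n_individuals                       # diploid: 2N allele copies
    drawn = sum(rng.random() < freq for _ in range(copies))
    return drawn / copies

def mean_abs_error(n_individuals, trials=2000, freq=0.5, seed=0):
    """Average one-generation deviation of allele frequency from the parents'."""
    rng = random.Random(seed)
    devs = [abs(drift_step(freq, n_individuals, rng) - freq)
            for _ in range(trials)]
    return sum(devs) / trials

# Smaller populations drift more; sampling error shrinks roughly as 1/sqrt(N).
small = mean_abs_error(10)     # noticeably far from 0.5 on average
large = mean_abs_error(1000)   # an order of magnitude closer to 0.5
```

The simulation mirrors the text's point: the sampling error (and hence drift) falls as population size grows, which is what lets temporary population crashes "relax" selection in Wright's argument.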

This preserves non-blending inheritance and even genetic drift, but it is not clear that it incorporates population genetics' implicit filter for "prima donna" alleles that only offer their adaptive advantage when their genetic context is just so. Does this basic difference between biology and EC contribute anything to our understanding of why recombination seems to play such different roles in the two systems?

1.3.2 The Adaptive Code Book

Molecular biology's Central Dogma connects genes to phenotype by stating that DNA is transcribed into RNA, which is then translated into protein.

The terms transcription and translation are quite literal: RNA is a chemical sister language to DNA. Both are polymers formed from an alphabet of four chemical letters (nucleotides), and transcription is nothing more than a process of complementing DNA, letter by letter, into RNA. It is the next step, translation, that profoundly influences biological evolution. Proteins are also linear polymers of chemical letters, but they are drawn from a qualitatively different alphabet (amino acids) comprising 20 elements.

Clearly no one-to-one mapping could produce a genetic code for translating nucleotides unambiguously into amino acids, and by 1966 it was known that the combinatorial set of possible nucleotide triplets forms a dictionary of "codons" that each translate into a single amino acid meaning. The initial surprise for evolutionary theory was to discover that something as fundamental as the code-book for life would exhibit a high degree of redundancy (an alphabet of 4 RNA letters permits 4 × 4 × 4 = 64 possible codons that map to one of only 20 amino acid meanings).
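The combinatorics behind this redundancy are easy to check. A small sketch (only the counts are computed; the actual codon-to-amino-acid table is not reproduced here):

```python
from itertools import product

NUCLEOTIDES = 'ACGU'   # the four RNA letters

# Every ordered triplet of nucleotides is a possible codon.
codons = [''.join(triplet) for triplet in product(NUCLEOTIDES, repeat=3)]
print(len(codons))        # 4 * 4 * 4 = 64 possible codons

# 64 codons mapping onto only 20 amino acid meanings forces redundancy:
print(len(codons) / 20)   # on average more than 3 codons per amino acid
```

A doublet code (4 × 4 = 16 codons) could not cover 20 amino acids, so triplets are the smallest codon length that works, and 64 − 20 leaves substantial room for the synonymous codons discussed next.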

Early interpretation fuelled arguments for non-Darwinian evolution: genetic variations that make no difference to the protein they encode must be invisible to selection and therefore governed solely by drift. More recently, both computing and biological evolutionary theory have started to place this coding neutrality in the bigger picture of the adaptive heuristic. Essentially, findings appear to mirror Wright's early arguments on the importance of genetic drift: redundancy in the code adds selectively neutral dimensions to the fitness landscape that render adaptive algorithms more effective by increasing the connectedness of local optima.

At present, an analogous reinterpretation is underway for a different adaptive feature of the genetic code: the observation that biochemically similar amino acids are assigned to codons that differ by only a single nucleotide. Early speculations that natural selection organized the genetic code so as to minimize the phenotypic impact of mutations have gained considerable evidential support as computer simulation enables exploration of theoretical codes that nature passed over. However, it seems likely that once again this phenomenon has more subtle effects in the broader context of the adaptive heuristic.

An "error minimizing code" may in fact maximize the probability that a random mutation produces an increase in fitness, in accordance with the geometric theory of gradualism (Fig. 1.6). Preliminary tests for this phenomenon reveal an even simpler influence: the error minimizing code smooths the fitness landscape where a random genetic code would render it rugged.

By clustering biochemically similar amino acids within mutational reach of one another, it ensures that any selection towards a specific amino acid property (e.g., hydrophobicity) will be towards an interconnected region of the fitness landscape rather than to an isolated local optimum.

1.3.3 The Genotype/Phenotype Dichotomy

Implicit to the concept of an adaptive genetic code is a deeper question that remains largely unanswered by biology: why does all known life use two qualitatively different polymers, nucleic acids and proteins, with the associated need for translation?

Current theories for the origin of this dichotomy focus on the discovery that RNA can act both as a genetic storage medium and as a catalytic molecule. Within the most highly conserved core of metabolism, all known organisms are found to use RNA molecules in roles we normally attribute to proteins (White, 1976).

Fig. 1.6 The fitness landscape for an organism of 2 phenotypic traits: (a) for any organism, we may define an isocline that connects all trait combinations of equal fitness; (b) (the fitness landscape from above) a random mutation of a given magnitude defines a circle of that radius around the organism; the probability that the mutation improves fitness (i.e., that the organism moves within the white area) is inversely proportional to its magnitude

However, the answer to how the dichotomy evolved has largely eclipsed the question of why RNA evolved a qualitatively different representation for phenotype. A typical biological answer would be that the larger alphabet size of amino acids unleashed a greater catalytic diversity for the replicators, with an associated increase in metabolic sophistication that optimized self-replication. Interestingly, we know that nucleic acids are not limited to the 4 chemical letters we see today: natural metabolically active RNAs utilize a vast repertoire of post-transcriptional modifications, and synthetic chemistry has demonstrated that multiple additional nucleotide letters can be added to the genetic alphabet even with today's cellular machinery. Furthermore, an increasing body of indirect evidence suggests that the protein alphabet itself underwent exactly the sort of evolutiona
