A Gentle Introduction To Evolutionary Computation


A Gentle Introduction to Evolutionary Computation

Xin Yao (x.yao@cs.bham.ac.uk)

1. What Is Evolutionary Computation
2. Different Evolutionary Algorithms
3. Major Areas Within Evolutionary Computation
4. Summary

What Is Evolutionary Computation

1. It is the study of computational systems which use ideas and get inspiration from natural evolution.
2. One of the principles borrowed is survival of the fittest.
3. Evolutionary computation (EC) techniques can be used in optimisation, learning and design.
4. EC techniques do not require rich domain knowledge to use. However, domain knowledge can be incorporated into EC techniques.

A Simple Evolutionary Algorithm

1. Generate the initial population P(0) at random, and set i ← 0;
2. REPEAT
   (a) Evaluate the fitness of each individual in P(i);
   (b) Select parents from P(i) based on their fitness in P(i);
   (c) Generate offspring from the parents using crossover and mutation to form P(i+1);
   (d) i ← i+1;
3. UNTIL halting criteria are satisfied
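To make the loop concrete, here is a minimal Python sketch of the algorithm above, assuming bit-string chromosomes, fitness-proportional selection, one-point crossover, and bit-flipping mutation (all introduced later in this talk). The helper names and parameter values are illustrative, not prescribed by the slides.

```python
import random

def simple_ea(fitness, n_bits=5, pop_size=4, generations=50,
              p_crossover=0.9, p_mutation=0.02):
    # 1. Generate the initial population P(0) at random.
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):                       # 2. REPEAT ... 3. UNTIL
        scores = [fitness(ind) for ind in pop]         # (a) evaluate fitness
        total = sum(scores)

        def pick():                                    # (b) roulette-wheel selection
            r = random.uniform(0, total)
            acc = 0.0
            for ind, s in zip(pop, scores):
                acc += s
                if acc >= r:
                    return ind
            return pop[-1]

        next_pop = []                                  # (c) crossover + mutation
        while len(next_pop) < pop_size:
            a, b = pick()[:], pick()[:]
            if random.random() < p_crossover:          # one-point crossover
                cut = random.randint(1, n_bits - 1)
                a, b = a[:cut] + b[cut:], b[:cut] + a[cut:]
            for child in (a, b):
                for j in range(n_bits):                # bit-flipping mutation
                    if random.random() < p_mutation:
                        child[j] = 1 - child[j]
                next_pop.append(child)
        pop = next_pop[:pop_size]                      # (d) i <- i + 1
    return max(pop, key=fitness)

# e.g. maximise f(x) = x^2 on 5-bit integers (the worked example that follows):
best = simple_ea(lambda bits: int("".join(map(str, bits)), 2) ** 2)
```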

EA as Population-Based Generate-and-Test

Generate: mutate and/or recombine individuals in a population.
Test: select the next generation from the parents and offspring.

How Does the Simple EA Work

Let's use the simple EA to maximise the function f(x) = x² with x in the integer interval [0, 31], i.e., x = 0, 1, ..., 30, 31.

The first step of EA applications is encoding (i.e., the representation of chromosomes). We adopt a binary representation for integers: five bits are used to represent integers up to 31. Assume that the population size is 4.

1. Generate the initial population at random, e.g., 01101, 11000, 01000, 10011. These are chromosomes or genotypes.

2. Calculate the fitness value for each individual.
   (a) Decode each individual into an integer (called a phenotype): 01101 → 13, 11000 → 24, 01000 → 8, 10011 → 19;
   (b) Evaluate the fitness according to f(x) = x²: 13 → 169, 24 → 576, 8 → 64, 19 → 361.

3. Select two individuals for crossover based on their fitness. If roulette-wheel selection is used, then

      p_i = f_i / Σ_j f_j.

   Two offspring are often produced and added to an intermediate population. Repeat this step until the intermediate population is filled. In our example:

      p1(13) = 169/1170 ≈ 0.14
      p2(24) = 576/1170 ≈ 0.49
      p3(8)  =  64/1170 ≈ 0.06
      p4(19) = 361/1170 ≈ 0.31

   Assume we have crossover(01101, 11000) and crossover(10011, 11000). We may obtain offspring 01100 and 11001 from crossover(01101, 11000) by choosing a random crossover point at 4, and obtain 10000 and 11011 from crossover(10011, 11000) by choosing a random crossover point at 2. Now the intermediate population is

      01100, 11001, 10000, 11011

4. Apply mutation to individuals in the intermediate population with a small probability. A simple mutation is bit-flipping. For example, we may have the following new population P(1) after random mutation:

      01101, 11001, 00000, 11011

5. Go to Step 2 if the halting criterion is not met.
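To check the arithmetic of this walkthrough, here is a small Python script that decodes the four example chromosomes, computes their fitness, derives the roulette-wheel probabilities, and applies the two crossovers; it reproduces the 169/1170 ≈ 0.14 figures and the offspring listed above.

```python
# Reproducing the numbers in the worked example above (a quick check script).
chromosomes = ["01101", "11000", "01000", "10011"]

phenotypes = [int(c, 2) for c in chromosomes]     # 13, 24, 8, 19
fitnesses = [x * x for x in phenotypes]           # 169, 576, 64, 361
total = sum(fitnesses)                            # 1170

for c, x, fit in zip(chromosomes, phenotypes, fitnesses):
    print(f"{c} -> {x:2d}, f = {fit:3d}, p = {fit / total:.2f}")

# One-point crossover: swap the tails after the given cut point.
def crossover(a, b, point):
    return a[:point] + b[point:], b[:point] + a[point:]

print(crossover("01101", "11000", 4))   # ('01100', '11001')
print(crossover("10011", "11000", 2))   # ('10000', '11011')
```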

Different Evolutionary Algorithms

There are several well-known EAs with different historical backgrounds, representations, variation operators, and selection schemes. In fact, EAs refer to a whole family of algorithms, not a single algorithm.

Genetic Algorithms (GAs)

1. First formulated by Holland for adaptive search, and by his students for optimisation, from the mid 1960s to the mid 1970s.
2. Binary strings have been used extensively as individuals (chromosomes).
3. Simulate Darwinian evolution.
4. Search operators are applied only to the genotypic representation (chromosomes) of individuals.
5. Emphasise the role of recombination (crossover). Mutation is used only as a background operator.
6. Often use roulette-wheel selection.

Evolutionary Programming (EP)

1. First proposed by Fogel et al. in the mid 1960s for simulating intelligence.
2. Finite state machines (FSMs) were used to represent individuals, although real-valued vectors have always been used in numerical optimisation.
3. It is closer to Lamarckian evolution.
4. Search operators (mutations only) are applied to the phenotypic representation of individuals.
5. It does not use any recombination.
6. Usually uses tournament selection.

Evolution Strategies (ES)

1. First proposed by Rechenberg and Schwefel in the mid 1960s for numerical optimisation.
2. Real-valued vectors are used to represent individuals.
3. They are closer to Lamarckian evolution.
4. They do have recombination.
5. They use self-adaptive mutations.

Genetic Programming (GP)

1. First used by de Garis to indicate the evolution of artificial neural networks, but used by Koza to indicate the application of GAs to the evolution of computer programs.
2. Trees (especially Lisp expression trees) are often used to represent individuals.
3. Both crossover and mutation are used.

Preferred Term: Evolutionary Algorithms

- EAs face the same fundamental issues as classical AI does, i.e., representation and search.
- Although GAs, EP, ES and GP are different, they are all variants of population-based generate-and-test algorithms. They share more similarities than differences!
- A better and more general term to use is evolutionary algorithms (EAs).

Variation Operators and Selection Schemes

Crossover/Recombination: k-point crossover, uniform crossover, intermediate crossover, global discrete crossover, etc.

Mutation: bit-flipping, Gaussian mutation, Cauchy mutation, etc.

Selection: roulette-wheel selection (fitness-proportional selection), rank-based selection (linear and nonlinear), tournament selection, elitism, etc.

Replacement Strategy: generational, steady-state (continuous), etc.

Specialised Operators: multi-parent recombination, inversion, order-based crossover, etc.

Major Areas in Evolutionary Computation

1. Optimisation
2. Learning
3. Design
4. Theory

Evolutionary Optimisation

1. Numerical (global) optimisation.
2. Combinatorial optimisation (of NP-hard problems).
3. Mixed optimisation.
4. Constrained optimisation.
5. Multiobjective optimisation.
6. Optimisation in a dynamic environment (with a dynamic fitness function).

Evolutionary Learning

Evolutionary learning can be used in supervised, unsupervised and reinforcement learning.

1. Learning classifier systems (rule-based systems).
2. Evolutionary artificial neural networks.
3. Evolutionary fuzzy logic systems.
4. Co-evolutionary learning.
5. Automatic modularisation of machine learning systems by speciation and niching.

Evolutionary Design

EC techniques are particularly good at exploring unconventional designs which are very difficult to obtain by hand.

1. Evolutionary design of artificial neural networks.
2. Evolutionary design of electronic circuits.
3. Evolvable hardware.
4. Evolutionary design of (building) architectures.

Summary

1. Evolutionary algorithms can be regarded as population-based generate-and-test algorithms.
2. Evolutionary computation techniques can be used in optimisation, learning and design.
3. Evolutionary computation techniques are flexible and robust.
4. Evolutionary computation techniques are definitely useful tools in your toolbox, but there are problems for which other techniques might be more suitable.

[Figure 1: Function f8.]

Global Optimisation by Mutation-Based EAs

1. Generate the initial population of µ individuals, and set k ← 1. Each individual is a real-valued vector x_i.
2. Evaluate the fitness of each individual.
3. Each individual creates a single offspring: for j = 1, ..., n,

      x_i'(j) = x_i(j) + N_j(0, 1)                                 (1)

   where x_i(j) denotes the j-th component of the vector x_i. N(0, 1) denotes a normally distributed one-dimensional random number with mean zero and standard deviation one. N_j(0, 1) indicates that the random number is generated anew for each value of j.

4. Calculate the fitness of each offspring.
5. For each individual, q opponents are chosen randomly from all the parents and offspring with an equal probability. For each comparison, if the individual's fitness is no greater than the opponent's, it receives a "win."
6. Select the µ best individuals (out of the 2µ) that have the most wins to be the next generation.
7. Stop if the stopping criterion is satisfied; otherwise, k ← k + 1 and go to Step 3.
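The following Python sketch puts steps 1-7 together for minimisation. µ and q follow the notation above; the function names, default parameter values and the sphere example are my own illustrations.

```python
import random

# A hedged sketch of the mutation-based EA above (minimisation assumed).

def mutation_ea(f, n, bounds, mu=100, q=10, generations=100):
    lo, hi = bounds
    # Step 1: initial population of mu real-valued vectors.
    pop = [[random.uniform(lo, hi) for _ in range(n)] for _ in range(mu)]
    for _ in range(generations):
        # Step 3: each individual creates one offspring, x'(j) = x(j) + N_j(0,1).
        offspring = [[xj + random.gauss(0, 1) for xj in ind] for ind in pop]
        union = pop + offspring                        # 2*mu candidates
        fits = [f(ind) for ind in union]
        # Step 5: q random opponents each; fitness no greater than the
        # opponent's (minimisation) counts as a "win".
        wins = [sum(fits[i] <= fits[j]
                    for j in random.sample(range(len(union)), q))
                for i in range(len(union))]
        # Step 6: the mu individuals with the most wins survive.
        ranked = sorted(range(len(union)), key=wins.__getitem__, reverse=True)
        pop = [union[i] for i in ranked[:mu]]
    return min(pop, key=f)

# Example: minimise the 30-d sphere function sum_j x_j^2.
best = mutation_ea(lambda x: sum(v * v for v in x), n=30, bounds=(-100, 100))
```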

Why N(0, 1)?

1. The standard deviation of the normal distribution determines the search step size of the mutation. It is a crucial parameter.
2. Unfortunately, the optimal search step size is problem-dependent.
3. Even for a single problem, different search stages require different search step sizes.
4. Self-adaptation can be used to get around this problem partially.

Function Optimisation by Classical EP (CEP)

(EP = Evolutionary Programming)

1. Generate the initial population of µ individuals, and set k ← 1. Each individual is taken as a pair of real-valued vectors (x_i, η_i), i ∈ {1, ..., µ}.
2. Evaluate the fitness score of each individual (x_i, η_i), i ∈ {1, ..., µ}, of the population based on the objective function f(x_i).
3. Each parent (x_i, η_i), i = 1, ..., µ, creates a single offspring (x_i', η_i') by: for j = 1, ..., n,

      x_i'(j) = x_i(j) + η_i(j) N_j(0, 1),                         (3)
      η_i'(j) = η_i(j) exp(τ' N(0, 1) + τ N_j(0, 1))               (4)

   where x_i(j), x_i'(j), η_i(j) and η_i'(j) denote the j-th components of the vectors x_i, x_i', η_i and η_i', respectively. N(0, 1) denotes a normally distributed one-dimensional random number with mean zero and standard deviation one. N_j(0, 1) indicates that the random number is generated anew for each value of j. The factors τ and τ' are commonly set to (√(2√n))⁻¹ and (√(2n))⁻¹, respectively.

4. Calculate the fitness of each offspring (x_i', η_i'), i ∈ {1, ..., µ}.
5. Conduct pairwise comparison over the union of parents (x_i, η_i) and offspring (x_i', η_i'), i ∈ {1, ..., µ}. For each individual, q opponents are chosen randomly from all the parents and offspring with an equal probability. For each comparison, if the individual's fitness is no greater than the opponent's, it receives a "win."
6. Select the µ individuals out of (x_i, η_i) and (x_i', η_i'), i ∈ {1, ..., µ}, that have the most wins to be the parents of the next generation.
7. Stop if the stopping criterion is satisfied; otherwise, k ← k + 1 and go to Step 3.
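As a sketch, one CEP reproduction step (Eqs. (3) and (4)) can be written as below. The pairing of a shared N(0, 1) with per-component N_j(0, 1) deviates, and the τ, τ' settings, follow the slide; the function shape and names are my own.

```python
import math
import random

# A sketch of one CEP reproduction step.
# Each individual is a pair (x, eta) of n-dimensional real vectors.

def cep_offspring(x, eta):
    n = len(x)
    tau = 1.0 / math.sqrt(2.0 * math.sqrt(n))    # tau  = (sqrt(2*sqrt(n)))^-1
    tau_prime = 1.0 / math.sqrt(2.0 * n)         # tau' = (sqrt(2*n))^-1
    n_global = random.gauss(0, 1)                # N(0,1): drawn once per offspring
    x_new, eta_new = [], []
    for j in range(n):
        nj = random.gauss(0, 1)                  # N_j(0,1): drawn anew for each j
        x_new.append(x[j] + eta[j] * nj)                                    # Eq. (3)
        eta_new.append(eta[j] * math.exp(tau_prime * n_global + tau * nj))  # Eq. (4)
    return x_new, eta_new
```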

What Do Mutation and Self-Adaptation Do

Fast EP

- The idea comes from fast simulated annealing.
- Use a Cauchy, instead of Gaussian, random number in Eq. (3) to generate new offspring. That is,

      x_i'(j) = x_i(j) + η_i(j) δ_j                                (5)

  where δ_j is a Cauchy random variable with scale parameter t = 1, and is generated anew for each value of j.
- Everything else, including Eq. (4), is kept unchanged in order to evaluate the impact of Cauchy random numbers.
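In code, the FEP change is a one-line substitution in the x-update. The inverse-CDF sampler below follows from the Cauchy distribution function given on the next slide; this is a sketch with my own names, not the slides' implementation.

```python
import math
import random

# FEP's mutation (Eq. (5)): only the x-update changes,
# using a standard Cauchy deviate delta_j instead of N_j(0, 1).

def std_cauchy():
    # Inverse-CDF sampling; see the distribution function on the next slide.
    return math.tan(math.pi * (random.random() - 0.5))

def fep_x_update(x, eta):
    # x'(j) = x(j) + eta(j) * delta_j; eta is still updated by Eq. (4).
    return [xj + ej * std_cauchy() for xj, ej in zip(x, eta)]
```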

Cauchy Distribution

Its density function is

      f_t(x) = (1/π) · t / (t² + x²),    −∞ < x < +∞,

where t > 0 is a scale parameter. The corresponding distribution function is

      F_t(x) = 1/2 + (1/π) arctan(x/t).
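Inverting F_t(x) = u gives x = t·tan(π(u − 1/2)), the standard way to draw Cauchy deviates. The short check below is my own sketch, not from the slides: since F₁(1) − F₁(−1) = 1/2, about half of all samples land in [−1, 1], while the rest spread over very heavy tails.

```python
import math
import random

def cauchy(t=1.0):
    # Solving F_t(x) = u for x gives x = t * tan(pi * (u - 1/2)).
    return t * math.tan(math.pi * (random.random() - 0.5))

samples = [cauchy() for _ in range(100_000)]
inside = sum(abs(s) <= 1.0 for s in samples) / len(samples)
print(f"fraction in [-1, 1]: {inside:.3f}")     # about 0.5, since F1(1) - F1(-1) = 1/2
print("largest |sample|:", max(abs(s) for s in samples))  # typically huge: heavy tails
```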

Gaussian and Cauchy Density Functions

[Figure: the density functions of N(0, 1) and the Cauchy distribution with t = 1, plotted over [−4, 4].]

Test Functions

- 23 functions were used in our computational studies. They have different characteristics.
- Some have a relatively high dimension.
- Some have many local optima.

[Figure 2: Function f9 at a closer look.]

[Figure 3: Function f18 at a closer look.]

Experimental Setup

- Population size 100.
- Competition size q = 10 for selection.
- All experiments were run 50 times, i.e., 50 trials.
- Initial populations were the same for CEP and FEP.

Experiments on Unimodal Functions

F    No. of   FEP                      CEP                      FEP−CEP
     Gen's    Mean Best    Std Dev     Mean Best    Std Dev     t-test
f1   1500     5.7×10⁻⁴     1.3×10⁻⁴    2.2×10⁻⁴     5.9×10⁻⁴      4.06†
f2   2000     8.1×10⁻³     7.7×10⁻⁴    2.6×10⁻³     1.7×10⁻⁴     49.83†
f3   5000     1.6×10⁻²     1.4×10⁻²    5.0×10⁻²     6.6×10⁻²     −3.79†
f4   5000     0.3          0.5         2.0          1.2          −8.25†
f5   20000    5.06         5.87        6.17         13.61        −0.52
f6   1500     0            0           577.76       1125.76      −3.67†
f7   3000     7.6×10⁻³     2.6×10⁻³    1.8×10⁻²     6.4×10⁻³    −10.72†

† The value of t with 49 degrees of freedom is significant at α = 0.05 by a two-tailed test.

Discussions on Unimodal Functions

- FEP performed better than CEP on f3–f7.
- CEP was better for f1 and f2.
- FEP converged faster, even for f1 and f2 (for a long period).

Experiments on Multimodal Functions f8–f13

F     No. of   FEP                      CEP                      FEP−CEP
      Gen's    Mean Best    Std Dev     Mean Best    Std Dev     t-test
f8    9000     −12554.5     52.6        −7917.1      634.5       −51.39†
f9    5000     4.6×10⁻²     1.2×10⁻²    89.0         23.1        −27.25†
f10   1500     1.8×10⁻²     2.1×10⁻³    9.2          2.8         −23.33†
f11   2000     1.6×10⁻²     2.2×10⁻²    8.6×10⁻²     0.12         −4.28†
f12   1500     9.2×10⁻⁶     3.6×10⁻⁶    1.76         2.4          −5.29†
f13   1500     1.6×10⁻⁴     7.3×10⁻⁵    1.4          3.7          −2.76†

† The value of t with 49 degrees of freedom is significant at α = 0.05 by a two-tailed test.

Discussions on Multimodal Functions f8–f13

- FEP converged faster to a better solution.
- FEP seemed to deal with many local minima well.

Experiments on Multimodal Functions f14–f23

F     No. of   FEP                      CEP                      FEP−CEP
      Gen's    Mean Best    Std Dev     Mean Best    Std Dev     t-test
f14   100      1.22         0.56        1.66         1.19        −2.21†
f15   4000     5.0×10⁻⁴     3.2×10⁻⁴    4.7×10⁻⁴     3.0×10⁻⁴     0.49
f16   100      −1.03        4.9×10⁻⁷    −1.03        4.9×10⁻⁷     0.0
f17   100      0.398        1.5×10⁻⁷    0.398        1.5×10⁻⁷     0.0
f18   100      3.02         0.11        3.00         …            1.0
f19   100      −3.86        1.4×10⁻⁵    −3.86        1.4×10⁻²    −1.0
f20   200      −3.27        5.9×10⁻²    −3.28        5.8×10⁻²     0.45
f21   100      −5.52        1.59        −6.86        2.67         3.56†
f22   100      −5.52        2.12        −8.27        2.95         5.44†
f23   100      −6.57        3.14        −9.10        2.92         4.24†

† The value of t with 49 degrees of freedom is significant at α = 0.05 by a two-tailed test.

Discussions on Multimodal Functions f14–f23

- The results are mixed! FEP and CEP performed equally well on f16 and f17. They are comparable on f15 and f18–f20.
- CEP performed better on f21–f23 (Shekel functions).
- Is it because the dimension was low, so that CEP appeared to be better?

Experiments on Low-Dimensional f8–f13

F     No. of   FEP                      CEP                      FEP−CEP
      Gen's    Mean Best    Std Dev     Mean Best    Std Dev     t-test
f8    500      −2061.74     58.79       −1762.45     176.21      −11.17†
f9    400      0.14         0.40        4.08         3.08         −8.89†
f10   400      8.6×10⁻⁴     1.8×10⁻⁴    8.1×10⁻²     0.34         −1.67
f11   1500     5.3×10⁻²     4.2×10⁻²    0.14         0.12         −4.64†
f12   200      1.5×10⁻⁷     1.2×10⁻⁷    2.5×10⁻²     0.12         −1.43
f13   200      3.5×10⁻⁷     1.8×10⁻⁷    3.8×10⁻³     1.4×10⁻²     −1.89

† The value of t with 49 degrees of freedom is significant at α = 0.05 by a two-tailed test.

Discussions on Low-Dimensional f8–f13

- FEP still converged faster to better solutions.
- Dimensionality does not play a major role in causing the difference between FEP and CEP.
- There must be something inherent in those functions which caused such a difference.

The Impact of Parameter t on FEP — Part I

Table 1: The mean best solutions found by FEP using different scale parameters t in the Cauchy mutation for functions f1 (1500), f2 (2000), f10 (1500), f11 (2000), f21 (100), f22 (100) and f23 (100). The values in "()" indicate the number of generations used in FEP. All results have been averaged over 50 runs.

Function   t = 0.0156   t = 0.0313   t = 0.0625   t = 0.1250   t = 0.2500
f1         1.0435       0.0599       0.0038       1.5×10⁻⁴     6.5×10⁻⁵
f2         3.8×10⁻⁴     3.1×10⁻⁴     5.9×10⁻⁴     …            …
f10        …            …            …            …            …
f11        0.0121       0.2237       0.1093       0.0740       0.0368
f21        −6.9236      −7.7261      −8.0487      −8.6473      −8.0932
f22        −7.9211      −8.3719      −9.1735      −9.8401      −9.1587
f23        −7.8588      −8.6935      −9.4663      −9.2627      −9.8107

The Impact of Parameter t on FEP — Part II

Table 2: The mean best solutions found by FEP using different scale parameters t in the Cauchy mutation for functions f1 (1500), f2 (2000), f10 (1500), f11 (2000), f21 (100), f22 (100) and f23 (100). The values in "()" indicate the number of generations used in FEP. All results have been averaged over 50 runs.

Function   t = 0.5000   t = 0.7500   t = 1.0000   t = 1.2500   t = 1.5000
f1         1.8×10⁻⁴     3.5×10⁻⁴     5.7×10⁻⁴     8.2×10⁻⁴     …
f2         …            …            …            …            …
f10        …            …            …            …            …
f11        …            …            …            …            0.0121
f21        −6.6272      −5.2845      −5.5189      −5.0095      −5.0578
f22        −7.6829      −6.9698      −5.5194      −6.1831      −5.6476
f23        −8.5037      −7.8622      −6.5713      −6.1300      −6.5364

Why Cauchy Mutation Performed Better

Given G(0, 1) and C(1), the expected lengths of Gaussian and Cauchy jumps are:

      E_Gaussian(x) = ∫₀^∞ x · (1/√(2π)) e^(−x²/2) dx = 1/√(2π) ≈ 0.399

      E_Cauchy(x) = ∫₀^∞ x / (π(1 + x²)) dx = +∞

It is obvious that Gaussian mutation is much more localised than Cauchy mutation.
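A small Monte Carlo run makes the contrast concrete (my own sketch, not from the slides): the mean absolute Gaussian jump converges to 2 × 0.399 ≈ 0.798 as the sample grows, while the Cauchy sample mean keeps drifting upwards because its expectation is infinite.

```python
import math
import random

# Monte Carlo illustration: the Gaussian mean |jump| converges,
# the Cauchy mean |jump| does not (infinite expectation).

random.seed(0)
for n in (10**3, 10**5, 10**6):
    g = sum(abs(random.gauss(0, 1)) for _ in range(n)) / n
    c = sum(abs(math.tan(math.pi * (random.random() - 0.5))) for _ in range(n)) / n
    print(f"n={n:>7}: mean |Gaussian| = {g:.3f}, mean |Cauchy| = {c:.1f}")
```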

Why and When Large Jumps Are Beneficial

(Only the 1-d case is considered here for convenience's sake.)

Take the Gaussian mutation with the G(0, σ²) distribution as an example, i.e.,

      f_{G(0,σ²)}(x) = (1 / (σ√(2π))) e^(−x²/(2σ²)),    −∞ < x < +∞.

The probability of generating a point in the neighbourhood of the global optimum x* is given by

      P_{G(0,σ²)}(|x − x*| ≤ ε) = ∫_{x*−ε}^{x*+ε} f_{G(0,σ²)}(x) dx        (6)

where ε > 0 is the neighbourhood size and σ is often regarded as the step size of the Gaussian mutation. Figure 4 illustrates the situation.

[Figure 4: Evolutionary search as neighbourhood search, where x* is the global optimum and ε > 0 is the neighbourhood size.]

An Analytical Result

It can be shown that

      ∂P_{G(0,σ²)}(|x − x*| ≤ ε) / ∂σ > 0    when x* + ε + δ ≥ σ.

That is, the larger σ is, the larger P_{G(0,σ²)}(|x − x*| ≤ ε) is, if x* + ε + δ ≥ σ. On the other hand, if x* + ε + δ ≤ σ, then ∂P_{G(0,σ²)}(|x − x*| ≤ ε) / ∂σ < 0, which indicates that P_{G(0,σ²)}(|x − x*| ≤ ε) decreases, exponentially, as σ increases.

Empirical Evidence I

Table 3: Comparison of CEP's and FEP's final results on f21 when the initial population is generated uniformly at random in the ranges 0 ≤ xi ≤ 10 and 2.5 ≤ xi ≤ 5.5. The results were averaged over 50 runs. The number of generations for each run was 100.

Initial Range     FEP                     CEP                     FEP−CEP
                  Mean Best    Std Dev    Mean Best    Std Dev    t-test
2.5 ≤ xi ≤ 5.5    −5.62        1.71       −7.90        2.85       4.58†
0 ≤ xi ≤ 10       −5.57        1.54       −6.86        2.94       2.94†
t-test‡           −0.16                   −1.80†

† The value of t with 49 degrees of freedom is significant at α = 0.05 by a two-tailed test.
‡ FEP(CEP)small − FEP(CEP)normal.

Empirical Evidence II

Table 4: Comparison of CEP's and FEP's final results on f21 when the initial population is generated uniformly at random in the ranges 0 ≤ xi ≤ 10 and 0 ≤ xi ≤ 100, and the ai's were multiplied by 10. The results were averaged over 50 runs. The number of generations for each run was 100.

Initial Range     FEP                     CEP                     FEP−CEP
                  Mean Best    Std Dev    Mean Best    Std Dev    t-test
0 ≤ xi ≤ 100      −5.80        3.21       −5.59        2.97       −0.40
0 ≤ xi ≤ 10       −5.57        1.54       −6.86        2.94        2.94†
t-test‡           −0.48                    2.10†

† The value of t with 49 degrees of freedom is significant at α = 0.05 by a two-tailed test.
‡ FEP(CEP)small − FEP(CEP)normal.

Summary

1. Cauchy mutation performs well when the global optimum is far away from the current search location. Its behaviour can be explained theoretically and empirically.
2. An optimal search step size can be derived if we know where the global optimum is. Unfortunately, such information is unavailable for real-world problems.
3. The performance of FEP can be improved further by choosing more suitable parameters, instead of copying CEP's parameter settings.

Reference

1. X. Yao, Y. Liu and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, 3(2):82-102, July 1999.

Search Step Size and Search Bias

1. The search step size of mutation is crucial in deciding the search performance.
2. In general, different search operators have different search step sizes, and are thus appropriate for different problems, as well as for different evolutionary search stages of a single problem.
3. The search bias of an evolutionary search operator includes its step size and its search directions. The search bias of a search operator determines how likely an offspring is to be generated from a parent (or parents).

Mixing Search Biases by Self-adaptation

1. Since the global optimum is unknown in real-world applications, it is impossible to know a priori what search biases we should use in EAs. One way to get around this problem is to use a variety of different biases and allow evolution to find out which one(s) are more promising than others.
2. Rather than using either Gaussian or Cauchy mutations, we can use both. That is, two candidate offspring will be generated from every parent, one by Gaussian mutation and one by Cauchy mutation. The fitter one will survive as the single child (see the sketch after this list).
3. The experimental results show that the improved fast EP (IFEP) is capable of performing as well as or better than the better of FEP and CEP for most of the chosen test functions. This is achieved through a minimal change to the existing FEP and CEP.
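A sketch of the IFEP child-generation step, assuming minimisation; the two-candidate idea is from the slide, while `std_cauchy` and the function shape are my own.

```python
import math
import random

# IFEP: each parent creates two candidate offspring, one Gaussian
# and one Cauchy; the fitter candidate survives as the single child.

def std_cauchy():
    return math.tan(math.pi * (random.random() - 0.5))

def ifep_child(f, x, eta):
    gauss_child = [xj + ej * random.gauss(0, 1) for xj, ej in zip(x, eta)]
    cauchy_child = [xj + ej * std_cauchy() for xj, ej in zip(x, eta)]
    # Keep whichever candidate is fitter (lower objective value).
    return min(gauss_child, cauchy_child, key=f)
```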

[Figure 5: Number of successful Cauchy mutations in a population when IFEP is applied to function f10. The vertical axis indicates the number of successful Cauchy mutations in a population and the horizontal axis indicates the number of generations. The results have been averaged over 50 runs.]

[Figure 6: Number of successful Cauchy mutations in a population when IFEP is applied to function f21. The vertical axis indicates the number of successful Cauchy mutations in a population and the horizontal axis indicates the number of generations. The results have been averaged over 50 runs.]

Other Mixing Methods

Mean mutation operator: takes the average of the two mutations:

      x_i'(j) = x_i(j) + η_i(j) · 0.5 (N_j(0, 1) + C_j(1))

   where N_j(0, 1) is a normally distributed number while C_j(1) follows the Cauchy distribution with parameter 1.

Adaptive mutation operator: actually a self-adaptive method:

      x_i'(j) = x_i(j) + η1_i(j) N_j(0, 1) + η2_i(j) C_j(1)

   where both η1_i(j) and η2_i(j) are self-adaptive parameters.
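Hedged sketches of the two operators follow; η1 and η2 match the slide's notation, and the function names are my own.

```python
import math
import random

def std_cauchy():
    return math.tan(math.pi * (random.random() - 0.5))

def mean_mutation(x, eta):
    # x'(j) = x(j) + eta(j) * 0.5 * (N_j(0,1) + C_j(1))
    return [xj + ej * 0.5 * (random.gauss(0, 1) + std_cauchy())
            for xj, ej in zip(x, eta)]

def adaptive_mutation(x, eta1, eta2):
    # x'(j) = x(j) + eta1(j) * N_j(0,1) + eta2(j) * C_j(1),
    # where eta1 and eta2 are both self-adapted (e.g., via Eq. (4)).
    return [xj + e1 * random.gauss(0, 1) + e2 * std_cauchy()
            for xj, e1, e2 in zip(x, eta1, eta2)]
```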

A More General Self-Adaptive Method

1. The idea of mixing can be generalised to Lévy mutation.
2. The Lévy probability distribution can be tuned to generate any distribution between the Gaussian and Cauchy probability distributions.
3. Hence we can use Lévy mutations with different parameters in EAs and let evolution decide which one to use.

An Anomaly of Self-adaptation in EP

[Figure 7: The 30-d sphere model stagnates early (mean of 50 runs; function value vs. generation).]

Why EP Stagnates Early

Table 5: The 19-th component and the fitness of the best individual in a typical run.

Generation   (x1(19), η1(19))        f(x1)   (1/µ) Σi f(xi)
    ⋮
300          (−14.50, 4.52×10⁻…)     …       …
    ⋮
600          (−14.50, 8.22×10⁻⁶)     …       …
    ⋮
1000         (−14.50, 1.33×10⁻⁸)     …       …
    ⋮
1500         (−14.50, 1.86×10⁻¹²)    …       …

The component x1(19) is stuck at −14.50 while its step size η1(19) collapses towards zero, so mutation can no longer move the search away from this point.

Getting Around the Anomaly

Set a lower bound! For example, set a fixed lower bound, e.g., 10⁻³.

Use Success Rate to Adjust Lower Bounds

      η_(t+1) = η_t (S_t / A),                                    (7)

where η_t denotes the lower bound at generation t, S_t is the success rate at generation t, and A is a reference parameter, which has been set between 0.25 and 0.45 in our experiments. The success rate S_t is obtained by first computing the number of offspring selected for the next generation and then taking the ratio of successes to all offspring.
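A minimal sketch of Eq. (7), assuming the success rate is computed per generation as described; `A = 0.35` is simply a value inside the 0.25–0.45 range mentioned above, and the function name is mine.

```python
# Eq. (7): eta_lb is the current lower bound on the step sizes;
# A is the reference parameter (0.25-0.45 in the slides' experiments).
def update_lower_bound(eta_lb, n_accepted, n_offspring, A=0.35):
    s_t = n_accepted / n_offspring        # success rate S_t
    return eta_lb * (s_t / A)             # grows if S_t > A, shrinks otherwise
```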

Use Mutation Step Size to Adjust Lower Bounds

Use the median of the mutation step sizes from all accepted (successful) offspring as the new lower bound for the next generation. Let δ_i(j) = η_i'(j) N_j(0, 1). We first calculate the average mutation step size, over all accepted (successful) offspring, for each component:

      δ̄(j) = (1/m) Σ_{v=1}^{m} δ_v(j),    j = 1, ..., n,

where m is the number of accepted offspring. Then the lower bound of η for the next generation is

      η_(t+1) = median{δ̄(j), j = 1, 2, ..., n}.                   (8)
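A sketch of Eq. (8). The slide leaves implicit whether signed or absolute step sizes are averaged, so absolute values are used here (an assumption of mine) to stop positive and negative steps from cancelling.

```python
import statistics

# Eq. (8): accepted_steps[v][j] holds delta_v(j) = eta'_v(j) * N_j(0, 1)
# for each accepted offspring v; abs() is an assumption, see lead-in.
def lower_bound_from_steps(accepted_steps, n):
    m = len(accepted_steps)                       # number of accepted offspring
    avg = [sum(abs(steps[j]) for steps in accepted_steps) / m
           for j in range(n)]                     # mean step size per component
    return statistics.median(avg)                 # new lower bound on eta
```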

Getting Around the Anomaly — Recombination

Intermediate recombination helps because it averages out extremely small step sizes.

Representation Is Important

Search and representation are fundamental to evolutionary search. They go hand-in-hand.

1. Binary strings have often been used to represent individuals, e.g., integers and real numbers. However, they may not be good representations, because the binary encoding of an integer or real number can introduce so-called Hamming cliffs: for example, the adjacent integers 7 (0111) and 8 (1000) differ in all four bits.
2. Gray coding can help, but does not solve the problem entirely. A better representation is to use integers or real numbers themselves.

Gray Code

integer   binary code   Gray code
0         000           000
1         001           001
2         010           011
3         011           010
4         100           110
5         101           111
6         110           101
7         111           100
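The table is the standard reflected binary code, which can be generated with the usual shift-and-XOR conversion (a sketch of mine, not given on the slide):

```python
# Standard reflected binary (Gray) code conversion; reproduces the table above.
def binary_to_gray(b: int) -> int:
    return b ^ (b >> 1)

def gray_to_binary(g: int) -> int:
    b = 0
    while g:
        b ^= g
        g >>= 1
    return b

for i in range(8):
    print(f"{i}  {i:03b}  {binary_to_gray(i):03b}")
# Adjacent integers always differ in exactly one Gray bit: 7 -> 8 is
# 0111 vs 1000 in plain binary (a 4-bit Hamming cliff) but 0100 vs 1100 in Gray.
```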

Adaptive Representation

1. Although we have been using Cartesian coordinates in all our examples so far, there are cases where a different representation would be more appropriate, e.g., polar coordinates.
2. The idea of self-adaptation can also be used in representations, where the most suitable representation is evolved rather than fixed in advance.
3. For example, Cartesian and polar representations can be mixed adaptively in an EA, so that evolution can choose which representation is best at the current stage of the evolutionary search.

Summary

1. Search step size is a crucial factor in determining an EA's performance.
2. Different operators, and EAs in general, have different search biases.
3. Mixing different operators and representations adaptively can lead to better performance for many (but not all) problems.
4. However, care must be taken, as self-adaptation does not always work as claimed.

References

1. X. Yao, Y. Liu and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, 3(2):82-102, July 1999.
2. K. H. Liang, X. Yao and C. S. Newton, "Adapting self-adaptive parameters in evolutionary algorithms," Applied Intelligence, 15(3):171-180, November/December 2001.
3. T. Schnier and X. Yao, "Using Multiple Representations in Evolutionary Algorithms," Proceedings of the 2000 Congress on Evolutionary Computation, IEEE Press, Piscataway, NJ, USA, July 2000, pp. 479-486.
4. K. Chellapilla, "Combining mutation operators in evolutionary programming," IEEE Transactions on Evolutionary Computation, 2(3):91-96, Sept. 1998.

The first three are downloadable from my web pages.

Two Approaches to Evolutionary Learning

Michigan approach: Holland-style learning classifier systems (LCS), where each individual is a rule. The whole population is a complete (learning) system.

Pitt approach: each individual is a complete system.

This talk deals only with Pitt-style evolutionary learning, since it is more widely used.

Current Practice in Evolutionary Learning

[Figure 8: A general framework for Pitt-style evolutionary learning: a population of individuals (learning systems, e.g., ANNs or rule-based systems) goes through fitness evaluation and selection; "genetic" operators (crossover, mutation) produce new individuals; the best individual is taken as the output.]

Fitness Evaluation

1. Based on the training error.
2. Based on the training error and complexity (regularisation), i.e.,

      fitness = 1 / (error + α · complexity)
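As a one-line sketch, with α as the regularisation weight (the default value is purely illustrative):

```python
# Regularised fitness: alpha trades training error against model complexity.
def fitness(error: float, complexity: float, alpha: float = 0.1) -> float:
    return 1.0 / (error + alpha * complexity)
```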

Evolutionary Learning and Optimisation

Learning has often been formulated as an optimisation problem. However, learning is different from optimisation.

1. In optimisation, the fitness function reflects what is needed. The optimal value is always better than the second optimal one.
2. In learning, there is no way to quantify generalisation exactly. A system with the minimum training error may not be the one with the best generalisation ability. Why select the "best" individual in a population as the final output?

Exploit Useful Information in a Population

- Since an individual with the minimum training error may not be the one with the best generalisation, it makes sense to exploit useful information in a population rather than in any single individual.
- All in all, it is a population that is evolving, not a single individual.
- Two types of experiments have been carried out to show that a population does contain more information than any single individual.

