Multi-Objective Optimization using Evolutionary Computation

Chapter 1

1 Introduction

The objective of this project is to design a user-friendly toolbox for multi-objective optimization using evolutionary computation. The toolbox solves multi-objective optimization problems with the help of genetic algorithms, optimizing decision parameters for problems with conflicting objectives. The toolbox is first tested on benchmark test problems.

1.1 Multi-Objective Optimization

Multi-objective optimization is the process of optimizing two or more conflicting objectives simultaneously, subject to certain constraints. Multi-objective optimization problems arise in fields where optimal decisions must be taken in the presence of trade-offs between conflicting objectives. Examples include minimizing the cost of a product while maximizing profit, or minimizing the fuel consumption of a vehicle while maximizing its performance. The general form of the multi-objective optimization problem is shown in equation 1.1: [1]

$$
\begin{aligned}
\text{Minimize/Maximize} \quad & f_m(x), & m &= 1, 2, \ldots, M; \\
\text{subject to} \quad & g_j(x) \ge 0, & j &= 1, 2, \ldots, J; \\
& h_k(x) = 0, & k &= 1, 2, \ldots, K; \\
& x_i^{(L)} \le x_i \le x_i^{(U)}, & i &= 1, 2, \ldots, n.
\end{aligned}
\tag{1.1}
$$

Where,
 A solution x is a vector of n decision variables: x = (x1, x2, x3, …, xn)^T.
 g_j(x) and h_k(x) are called constraint functions; J and K are the numbers of inequality and equality constraints, respectively.
 The last set of constraints are the variable bounds, i.e. each decision variable x_i should take a value within a lower bound x_i^(L) and an upper bound x_i^(U).

Multi-objective optimization is sometimes called vector optimization because a vector of objectives, rather than a single objective, is optimized. Consequently, instead of a unique solution, the solution of a multi-objective problem is a set of Pareto points.

1.2 Multi-Objective Optimization using Genetic Algorithm

A general optimization problem consists of a number of objectives and is associated with a number of equality and inequality constraints. Solutions to a multi-objective optimization problem can be expressed mathematically as a set of non-dominated points: a solution dominates another if it is at least as good in every criterion and strictly better in at least one. A solution is called Pareto-optimal if it is not dominated by any other solution in the given search space. The most common difficulty with multi-objective optimization is the conflict between objectives: in general, no feasible solution is simultaneously optimal for all objectives, so mathematically the most favorable Pareto-optimal solution is the one that offers the least objective conflict. [1]

In order to find such solutions, classical methods scalarize the objective vector into a single objective. The simplest of these classical techniques is the weighted sum. It aggregates the objectives into a single parameterized objective through a linear combination of the objectives. However, the selection of an appropriate weight vector also depends on the scaling of each objective function: different objectives take on different orders of magnitude, so when they are weighted into a composite objective function they should be scaled so that each contributes with roughly the same magnitude. Even then, the solution obtained by this strategy largely depends on the weight vector. [1]
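The scalarization just described is easy to state in code. The following is a minimal Python sketch of weighted-sum scalarization with per-objective scaling (a generic illustration, not part of the report's MATLAB toolbox; the function name and the example magnitudes are hypothetical):

```python
def weighted_sum(objectives, weights, scales):
    """Scalarize a vector of objectives into one value to be minimized.

    scales: the typical magnitude of each objective, used to normalize
    differently scaled objectives before weighting (see the caveat above).
    """
    return sum(w * f / s for f, w, s in zip(objectives, weights, scales))

# Example: product cost in dollars (order 1e4) and fuel use in liters
# (order 1e1); without the scales, cost would dominate the composite.
objectives = (12000.0, 8.5)
print(weighted_sum(objectives, weights=(0.5, 0.5), scales=(1e4, 1e1)))  # 1.025
```

Changing the weight vector steers the search toward a different point of the trade-off surface, which is exactly the sensitivity noted above.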
To avoid such difficulties, Pareto-based evolutionary optimization has become an alternative to classical techniques such as the weighted-sum method. This approach was first proposed by Goldberg, and it explicitly uses Pareto dominance to determine the reproduction probability of each individual.

2.3.2 Selection

Selection chooses the individuals that contribute to the population at the next generation: it determines which individuals are chosen for recombination and how many offspring each selected individual produces. At this stage, a proportion of the existing population is selected to breed a new generation. Fitness is calculated using a fitness function, and fitter solutions are more likely to be selected. Important terms for comparing different selection schemes are: [6]

 Selective Pressure: the probability of the best individual being selected, compared to the average selection probability of all individuals.
 Bias: the absolute difference between an individual's normalized fitness and its expected probability of reproduction.
 Spread: the range of possible values for the number of offspring of an individual.
 Loss of Diversity: the proportion of individuals of a population that is not selected during the selection phase.
 Selection Intensity: the expected average fitness value of the population after applying a selection method to the normalized Gaussian distribution.
 Selection Variance: the expected variance of the fitness distribution of the population after applying a selection method to the normalized Gaussian distribution.

Commonly used methods for selection are:
 Rank-based fitness assignment
 Roulette wheel selection
 Stochastic universal sampling
 Local selection
 Truncation selection
 Tournament selection

Rank-Based Fitness Assignment

Rank-based fitness assignment sorts the population according to the objective values; the fitness value depends only on the position of each individual in this ranking, not on the raw objective values. It overcomes the scaling problem of proportional fitness assignment and is more robust than it, since ranking introduces uniform scaling across the population. Ranking may be linear or non-linear.

Linear Ranking

Let Nind be the number of individuals in the population, Pos the position of an individual in this population (the least fit individual has Pos = 1, the fittest Pos = Nind), and SP the selective pressure. The fitness value for an individual is calculated as in equation 2.1: [6]

$$
\text{Fitness}(\text{Pos}) = 2 - SP + 2\,(SP - 1)\,\frac{\text{Pos} - 1}{N_{ind} - 1}
\tag{2.1}
$$

Linear ranking allows values of selective pressure in the range [1.0, 2.0]. A short sketch of this assignment follows.
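Equation 2.1 is easy to check with a small Python sketch (a generic illustration with my own function name, not the GEATbx implementation; it assumes a minimization problem and at least two individuals):

```python
def linear_ranking(objectives, sp=2.0):
    """Rank-based fitness per equation 2.1 (minimization).

    objectives: raw objective values (lower is better).
    sp: selective pressure, which must lie in [1.0, 2.0].
    The least fit individual gets Pos = 1, the fittest Pos = Nind.
    """
    nind = len(objectives)  # assumes nind >= 2
    # Sort indices from worst (largest objective) to best (smallest).
    order = sorted(range(nind), key=lambda i: objectives[i], reverse=True)
    fitness = [0.0] * nind
    for pos, i in enumerate(order, start=1):
        fitness[i] = 2.0 - sp + 2.0 * (sp - 1.0) * (pos - 1) / (nind - 1)
    return fitness

# Fitness depends only on rank, not on the raw magnitudes:
print(linear_ranking([9.0, 1.0, 4.0, 100.0, 2.5]))
# -> [0.5, 2.0, 1.0, 0.0, 1.5]
```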
Non-Linear Ranking

Another method ranks using a non-linear distribution; non-linear ranking permits higher selective pressure than the linear ranking method. The fitness value for an individual is calculated as shown in equation 2.2, where the base X is determined by the chosen selective pressure: [6]

$$
\text{Fitness}(\text{Pos}) = \frac{N_{ind} \cdot X^{\,\text{Pos} - 1}}{\sum_{i=1}^{N_{ind}} X^{\,i - 1}}
\tag{2.2}
$$

Non-linear ranking allows values of selective pressure in the range [1.0, Nind − 2.0]. Figure 2.2 compares linear ranking (for SP = 2) and non-linear ranking (for SP = 3).

[Figure 2.2 Comparison of Linear Ranking and Non-linear Ranking: fitness plotted against the position of individuals for linear ranking with SP = 2 and non-linear ranking with SP = 3.]

Properties of Linear Ranking

The properties of linear ranking, as functions of the selective pressure, are shown in figure 2.3.

[Figure 2.3 Properties of Linear Ranking: selection intensity, loss of diversity, and selection variance plotted against selective pressure.]

Selection intensity:
$$ \text{SelInt}(SP) = \frac{SP - 1}{\sqrt{\pi}} \tag{2.3} $$

Loss of diversity:
$$ \text{LossDiv}(SP) = \frac{SP - 1}{4} \tag{2.4} $$

Selection variance:
$$ \text{SelVar}(SP) = 1 - \frac{(SP - 1)^2}{\pi} \tag{2.5} $$

Properties of Tournament Selection

Selection intensity:
$$ \text{SelInt}(Tour) \approx \sqrt{2\left(\ln(Tour) - \ln\sqrt{4.14\,\ln(Tour)}\right)} \tag{2.9} $$

Loss of diversity:
$$ \text{LossDiv}(Tour) = Tour^{-\frac{1}{Tour - 1}} - Tour^{-\frac{Tour}{Tour - 1}} \tag{2.10} $$

(About 50% of the population is lost at tournament size Tour = 5.)

Selection variance:
$$ \text{SelVar}(Tour) \approx \frac{0.918}{\ln(1.186 + 1.328\,Tour)} \tag{2.11} $$

2.3.3 Recombination

Recombination produces new individuals by combining the information contained in two or more parents. This is done by combining the variable values of the parents. Depending on the representation of the variables, different methods must be used: [6]
 Discrete recombination (for all types of representation)
 Real-valued recombination
 Binary-valued recombination (crossover)

2.3.4 Mutation

Mutation randomly alters individuals. These variations are mostly small and are applied to the variables of the individuals with a low probability. Normally, offspring are mutated after being created by recombination. [6] Two approaches exist:
 Real-valued mutation
 Binary mutation

2.3.5 Reinsertion

Once offspring have been produced by the selection, recombination, and mutation of individuals from the old population, their fitness must be determined. If fewer offspring are produced than the size of the original population, the offspring are reinserted into the old population to maintain its size. Similarly, if more offspring are generated than the size of the original population, a reinsertion scheme determines which individuals will exist in the new population. Reinsertion is divided into two schemes (a sketch combining all of these stages follows this list): [6]
 Global reinsertion
 Local reinsertion
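Sections 2.3.2 through 2.3.5 describe the stages of a single generation. The sketch below wires them together for a one-variable real-coded minimization problem, using tournament selection, intermediate (arithmetic) recombination, real-valued mutation, and elitist global reinsertion; it is a generic Python illustration with my own names and parameter values, not the report's MATLAB toolbox:

```python
import random

def evolve(cost, lb, ub, pop_size=20, generations=100,
           tour=2, pmut=0.1, sigma=0.1):
    """Minimal real-coded GA minimizing `cost` over the interval [lb, ub]."""
    pop = [random.uniform(lb, ub) for _ in range(pop_size)]
    for _ in range(generations):
        elite = min(pop, key=cost)               # kept for elitist reinsertion
        # Selection: tournament of size `tour`.
        def select():
            return min(random.sample(pop, tour), key=cost)
        offspring = []
        while len(offspring) < pop_size - 1:
            p1, p2 = select(), select()
            # Recombination: intermediate (arithmetic) crossover.
            a = random.random()
            child = a * p1 + (1.0 - a) * p2
            # Mutation: small real-valued perturbation, low probability.
            if random.random() < pmut:
                child += random.gauss(0.0, sigma * (ub - lb))
            offspring.append(min(max(child, lb), ub))  # respect variable bounds
        # Reinsertion: global and elitist -- the best parent always survives.
        pop = offspring + [elite]
    return min(pop, key=cost)

# Example: minimize (x - 1)^2 on [-5, 5]; the result should be near x = 1.
print(evolve(lambda x: (x - 1.0) ** 2, lb=-5.0, ub=5.0))
```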
Chapter 3

3 Results and Conclusions

In multi-objective evolutionary computation, many researchers have used test problems with known Pareto-optimal solutions; such problems are also discussed by Van Veldhuizen [1999] in his doctoral thesis. Genetic algorithms were first proposed as multi-objective optimizers by Schaffer [1984]; however, the first Pareto-based multi-objective evolutionary algorithm to be published was the multi-objective genetic algorithm developed by Fonseca and Fleming [1993]. Genetic algorithms are appropriate search engines for multi-objective problems because of their population-based approach. Let us take Schaffer's two-objective, single-variable problem, shown in equation 3.1 [1] and plotted in figure 3.1:

$$
\text{Minimize} \quad
\begin{cases}
f_1(x) = x^2 \\
f_2(x) = (x - 2)^2
\end{cases}
\tag{3.1}
$$

[Figure 3.1 Schaffer's Problem: the objectives f1 and f2 plotted against the variable x over [−5, 5].]

This problem has Pareto-optimal solutions in the range [0, 2], and the Pareto-optimal front is convex. As it is a multi-objective problem, it cannot be solved …
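The claim that the Pareto-optimal set is [0, 2] can be verified numerically with a small Python sketch (a generic illustration; the dominance test is the standard one for minimization, and the sampling grid is my own choice):

```python
def f1(x):
    return x ** 2            # first objective of Schaffer's problem

def f2(x):
    return (x - 2) ** 2      # second objective

def dominates(a, b):
    """True if objective vector `a` Pareto-dominates `b` (both minimized):
    no worse in every objective and strictly better in at least one."""
    return all(u <= v for u, v in zip(a, b)) and any(u < v for u, v in zip(a, b))

# Sample the variable on [-5, 5] and keep only the non-dominated points.
xs = [i / 100.0 for i in range(-500, 501)]
evaluated = [(x, (f1(x), f2(x))) for x in xs]
front = [x for x, fx in evaluated
         if not any(dominates(fy, fx) for _, fy in evaluated if fy != fx)]
print(min(front), max(front))  # -> 0.0 2.0, matching the Pareto range [0, 2]
```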
3.2 Results of Remainder Selection Technique

[Figure 3.4 Plots for Remainder Selection: best/mean fitness, average distance between individuals, best/worst/mean scores, and selection histogram over 100 generations.]

The result of Schaffer's problem using remainder selection is displayed in the four plots of figure 3.4. The first plot shows the best and the mean fitness; from figure 3.4, both the best fitness value and the mean fitness value are 1. The second plot shows the average distance between individuals. The third plot shows the range, i.e. the maximum, minimum, and mean fitness values per generation. The fourth plot is the selection histogram of parents, displaying how many children each parent contributes to each generation. The best and the mean fitness values are equal to 1 because the average distance between the individuals becomes approximately zero after the 30th generation.

3.3 Results of Uniform Selection Technique

Using uniform selection, the function converges at approximately the 85th generation with fitness value 1.000000090533225, and the final point of the fitness value is 1.0003. The result of Schaffer's problem using the uniform technique is displayed in four graphs, laid out as in figure 3.4, in figure 3.5.

[Figure 3.5 Plots for Uniform Selection]

From figure 3.5, the best fitness value and the mean fitness value are both 1, because the average distance between the individuals becomes approximately zero after the 85th generation.

3.4 Results of Roulette Wheel Selection Technique

Using roulette wheel selection, the function converges at approximately the 61st generation with fitness value 1.0000000673207827, and the final point of the fitness value is 0.99974. The result of Schaffer's problem using the roulette wheel technique is displayed in figure 3.6.

[Figure 3.6 Plots for Roulette Wheel Selection]

From figure 3.6, the best and the mean fitness values become equal to 1 as the average distance between the individuals becomes approximately zero after the 61st generation.

3.5 Results of Tournament Selection Technique

Using tournament selection, the function converges at approximately the 51st generation with fitness value 1.000000205995548, and the final point of the fitness value is 0.99955. The result of Schaffer's problem using the tournament technique is displayed in figure 3.7.

[Figure 3.7 Plots for Tournament Selection]

3.6 Results of Different Crossover Techniques

[Figure 3.9 Plots for Heuristic Crossover]

If the crossover function is intermediate and the selection technique is remainder, the function converges at approximately the 43rd generation with fitness value 0.9999999999999999, and the final point of the fitness value is 1. The result of Schaffer's problem using remainder selection with intermediate crossover is displayed in figure 3.10. The best and the mean fitness values are 1 because the average distance between the individuals becomes approximately zero after the 43rd generation.

[Figure 3.10 Plots for Intermediate Crossover]

3.7 Results of Different Mutation Techniques

If the mutation function is uniform, the selection technique is remainder, and the crossover technique is single point, the function converges at approximately the 43rd generation with fitness value 1.0000002093605127, and the final point of the fitness value is 0.99954. The result of Schaffer's problem using remainder selection with single-point crossover and uniform mutation is displayed in figure 3.11. The best and the mean fitness values are 1 because the average distance between the individuals becomes approximately zero after the 43rd generation.

[Figure 3.11 Plots for Uniform Mutation]
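Each experiment above reports the generation at which the run converges, judged by the best and mean fitness coinciding once the average distance between individuals approaches zero. A helper along these lines (my own Python sketch of that criterion, not the toolbox's stopping rule) could extract the number from recorded fitness histories:

```python
def convergence_generation(best_history, mean_history, tol=1e-3):
    """First generation from which the mean fitness stays within `tol`
    of the best fitness for the rest of the run; None if it never does."""
    n = len(best_history)
    for gen in range(n):
        if all(abs(mean - best) <= tol
               for best, mean in zip(best_history[gen:], mean_history[gen:])):
            return gen
    return None

# Example histories: the population collapses onto the optimum around
# generation 3, so best and mean fitness coincide from there on.
best = [5.0, 2.0, 1.2, 1.0, 1.0, 1.0]
mean = [9.0, 4.0, 2.0, 1.0005, 1.0002, 1.0001]
print(convergence_generation(best, mean))  # -> 3
```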
If the mutation function is adaptive feasible, the selection technique is remainder, and the crossover technique is single point, the function converges at approximately the 41st generation with fitness value 1.0000001310160256, and the final point of the fitness value is 1.00036. The result of Schaffer's problem using remainder selection with single-point crossover and adaptive feasible mutation is displayed in figure 3.12.

[Figure 3.12 Plots for Adaptive Feasible Mutation]

From table 3.3, uniform mutation using single-point crossover and remainder selection converges approximately after the 28th generation, which is the best among the mutation techniques compared above.

Chapter 4

4 References

[1] Kalyanmoy Deb, Multi-Objective Optimization using Evolutionary Algorithms, John Wiley & Sons, 2001.
[2] Wikipedia, the free encyclopedia, URL: http://en.wikipedia.org/wiki/
[3] Kalyanmoy Deb, "Multi-objective genetic algorithms: Problem difficulties and construction of test problems", Evolutionary Computation Journal, 7(3):205–230, 1999.
[4] "Introduction to Genetic Algorithms", URL: http://www.rennard.org/alife/english/gavintrgb.html
[5] "Evolutionary Algorithms: Genetic Algorithms, Evolutionary Programming and Genetic Programming", URL: http://www.cs.sandia.gov/opt/survey/ea.html
[6] "GEATbx: Genetic and Evolutionary Algorithm Toolbox for use with MATLAB Documentation", URL: http://www.geatbx.com/docu/index.html
[7] S. Rajasekaran, G.A. Vijayalakshmi Pai, Neural Networks, Fuzzy Logic, and Genetic Algorithms: Synthesis and Applications, New Delhi, 2003.