 

Genetic Algorithm selection and crossover

I have been doing some research on genetic algorithms for a project in my AI class, but I am a little confused about what seems to be the traditional algorithm.

Basically, I wonder why they use different selection methods like roulette wheel selection to choose parents to reproduce. Why not just choose the parents with the best fitness scores and call it a day?

Crossover confuses me as well. It randomly chooses points every time to splice parent information, but it would seem to make more sense for the crossover points to depend on previous information. If a chromosome string is known to be good up to a point, the crossover could still be random, just not within the range of the good part of the string.

Any thoughts?

asked Nov 12 '11 by user1013905



1 Answer

Selection

If you only ever choose the best parents, what you get is hill climbing. Hill climbing works nicely, but generally, the more difficult the problem, the more likely you are to get stuck in a position from which you can make no further progress.

Generally, the harder the problem, the more such local optima there are. Selecting other individuals in addition to the best ones maintains the diversity of the population: the solutions are spread further out in the search space, and if a part of the population is stuck in a local optimum, a different part of the population can still make progress.
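
To make the contrast concrete, here's a rough sketch in Python (illustrative only, not part of any particular GA library; it assumes fitness values are non-negative). With roulette wheel selection a weak individual can still occasionally become a parent, which is exactly what keeps the population diverse; greedy selection collapses onto the current best and behaves like hill climbing:

```python
import random

def roulette_wheel_select(population, fitnesses):
    """Pick one individual with probability proportional to its (non-negative) fitness."""
    total = sum(fitnesses)
    pick = random.uniform(0, total)
    running = 0.0
    for individual, fitness in zip(population, fitnesses):
        running += fitness
        if running >= pick:
            return individual
    return population[-1]  # guard against floating-point round-off

def greedy_select(population, fitnesses):
    """Always pick the single fittest individual -- this degenerates into hill climbing."""
    return max(zip(population, fitnesses), key=lambda pair: pair[1])[0]
```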

Modern genetic algorithms usually devote a lot of effort to maintaining the diversity of the population to prevent premature convergence. One technique for that is fitness sharing. Another simple way to do this is to divide the population into different species, so that individuals of different species can't (or can only rarely) reproduce with each other.
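
As an illustration of fitness sharing (again just a sketch; the `distance` function and `sigma_share` radius are problem-specific choices you'd have to supply yourself): each individual's raw fitness is divided by a niche count that grows with the number of similar individuals, so crowded regions of the search space become less attractive.

```python
def shared_fitness(i, population, fitnesses, distance, sigma_share=1.0):
    """Scale down individual i's raw fitness by how crowded its neighbourhood is.

    `distance` measures similarity between two individuals; anything closer
    than `sigma_share` to individual i contributes to its niche count.
    """
    niche_count = 0.0
    for other in population:
        d = distance(population[i], other)
        if d < sigma_share:
            niche_count += 1.0 - d / sigma_share  # triangular sharing function
    return fitnesses[i] / max(niche_count, 1.0)
```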

Crossover

Crossover tries to distribute good parts of the genome among individuals that have arisen due to mutation. It would indeed be nice if one could just swap the good parts of the genome, and this has been attempted; for example, you can look at each gene and measure the average fitness of individuals possessing that gene.

There are two main problems with this, though:

  1. It is computationally expensive.

  2. There might be interdependencies in the genome. Maybe gene A looks really good according to your metric, but gene B doesn't, so you leave it out. In reality though, it might be that gene A doesn't actually work without gene B being present.
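
For comparison, the plain "blind" one-point crossover the question describes is cheap and makes no assumptions about which genes are good. A minimal sketch (my own, assuming chromosomes are equal-length lists of genes with at least two entries):

```python
import random

def one_point_crossover(parent_a, parent_b):
    """Swap the tails of two equal-length chromosomes at a random cut point."""
    point = random.randint(1, len(parent_a) - 1)  # cut somewhere strictly inside
    child_a = parent_a[:point] + parent_b[point:]
    child_b = parent_b[:point] + parent_a[point:]
    return child_a, child_b
```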

answered Sep 23 '22 by DataWraith