
What's differential evolution and how does it compare to a genetic algorithm?


From what I've read so far they seem very similar. Differential evolution uses floating point numbers instead, and the solutions are called vectors? I'm not quite sure what that means. Could someone provide an overview, along with a bit about the advantages and disadvantages of both?

asked Feb 29 '12 by Undefined

People also ask

Is differential evolution a genetic algorithm?

Conclusion. Differential evolution differs from standard genetic algorithms in that it exploits directional information within the population through the use of a target vector and difference vectors. This lets differential evolution converge faster to solutions, at the cost of weaker exploration.

What are the differences between evolution strategies and genetic algorithms?

In evolution strategies, the individuals are coded as vectors of real numbers. The step size or "mutation strength" is encoded in the individual, so good parameters get passed to the next generation by selecting good individuals. In genetic algorithms, the individuals are classically coded as bit strings or integers.

What is differential evolution and why do we use this algorithm?

Differential evolution (DE) is a population-based metaheuristic search algorithm that optimizes a problem by iteratively improving a candidate solution based on an evolutionary process. Such algorithms make few or no assumptions about the underlying optimization problem and can quickly explore very large design spaces.

What is genetic algorithm in simple words?

The genetic algorithm is a method for solving both constrained and unconstrained optimization problems that is based on natural selection, the process that drives biological evolution. The genetic algorithm repeatedly modifies a population of individual solutions.


2 Answers

Well, both genetic algorithms and differential evolution are examples of evolutionary computation.

Genetic algorithms keep pretty closely to the metaphor of genetic reproduction. Even the language is mostly the same: both talk of chromosomes and genes, the genes are drawn from a discrete alphabet, both talk of crossover, and the crossover operator is fairly close to a low-level understanding of genetic reproduction, etc.
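To make that metaphor concrete, here's a minimal sketch of a classic bit-string GA with one-point crossover and bit-flip mutation. The function names and parameters are mine, just for illustration:

```python
import random

# Minimal bit-string GA sketch: chromosomes are lists of 0/1 genes.
def crossover(parent_a, parent_b):
    # One-point crossover: splice the head of one parent onto the tail of the other.
    point = random.randint(1, len(parent_a) - 1)
    return parent_a[:point] + parent_b[point:]

def mutate(chromosome, rate=0.01):
    # Bit-flip mutation: each gene flips with a small probability.
    return [1 - g if random.random() < rate else g for g in chromosome]

def evolve(population, fitness, generations=100):
    for _ in range(generations):
        # Truncation selection: keep the fitter half as parents.
        population.sort(key=fitness, reverse=True)
        parents = population[: len(population) // 2]
        children = [
            mutate(crossover(random.choice(parents), random.choice(parents)))
            for _ in range(len(population) - len(parents))
        ]
        population = parents + children
    return max(population, key=fitness)

# Example: maximize the number of ones in a 20-bit chromosome.
best = evolve([[random.randint(0, 1) for _ in range(20)] for _ in range(30)], sum)
```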

Differential evolution is in the same style, but the correspondences are not as exact. The first big change is that DE uses actual real numbers (in the strict mathematical sense: they're implemented as floats, or doubles, or whatever, but in theory they range over the field of reals). As a result, the ideas of mutation and crossover are substantially different. The mutation operator is modified so much that it's hard to even see why it's called mutation, as such, except that it serves the same purpose of breaking things out of local minima.
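A rough sketch of what DE's "mutation" and crossover look like in practice (the scaling factor F and crossover rate CR are conventional DE parameters; the helper names here are mine):

```python
import random

def de_mutation(population, i, F=0.8):
    # Pick three distinct vectors other than the target, then add a scaled
    # difference of two of them to the third: v = a + F * (b - c).
    a, b, c = random.sample([x for j, x in enumerate(population) if j != i], 3)
    return [ai + F * (bi - ci) for ai, bi, ci in zip(a, b, c)]

def de_crossover(target, mutant, CR=0.9):
    # Binomial crossover: take each component from the mutant with
    # probability CR, otherwise keep the target's component.
    j_rand = random.randrange(len(target))  # guarantee at least one mutant gene
    return [m if (random.random() < CR or j == j_rand) else t
            for j, (t, m) in enumerate(zip(target, mutant))]
```

The trial vector produced this way only replaces the target vector if it scores better, which is the greedy selection step DE uses instead of a GA-style generational replacement.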

On the plus side, there are a handful of results showing DEs are often more effective and/or more efficient than genetic algorithms. And when working in numerical optimization, it's nice to be able to represent things as actual real numbers instead of having to work your way around to a chromosomal kind of representation, first. (Note: I've read about them, but I've not messed extensively with them so I can't really comment from first hand knowledge.)
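If you want to try DE on a numerical problem without hand-rolling the operators, SciPy's optimize module ships an implementation. A minimal usage sketch (the objective and bounds here are just an example):

```python
from scipy.optimize import differential_evolution

# Minimize a simple 2-D objective over box bounds.
def objective(x):
    return (x[0] - 1) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

result = differential_evolution(objective, bounds=[(-5, 5), (-5, 5)], seed=0)
print(result.x, result.fun)  # best vector found and its objective value
```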

On the negative side, I don't think there's been any proof of convergence for DEs, yet.

answered Nov 06 '22 by Novak


Differential evolution is actually a specific subset of the broader space of genetic algorithms, with the following restrictions:

  • The genotype is some form of real-valued vector
  • The mutation / crossover operations make use of the difference between two or more vectors in the population to create a new vector (typically by adding some random proportion of the difference to one of the existing vectors, plus a small amount of random noise)

DE performs well in certain situations because the vectors can be considered to form a "cloud" that explores the high-value areas of the solution space quite effectively. In some senses it's pretty closely related to particle swarm optimization.

It still has the usual GA problem of getting stuck in local minima, however.

answered Nov 06 '22 by mikera