
Prof. Dr. rer. nat. habil. Hans-Georg Beyer
Recent Talks and Tutorials
1. Evolution Strategies are Not Gradient Followers
In order to explain the working principles of Evolution Strategies (ESs)
in real-valued search spaces, the picture of a (stochastic)
gradient-approximating strategy is sometimes invoked. There are publications
in the fields of machine learning and evolutionary algorithms where this
misleading picture is promoted. Therefore, I gave a talk at
Dagstuhl,
Seminar 19431 (Oct. 20 - 25, 2019),
showing that this picture is not correct: ESs are much more explorative than
gradient strategies and thus have a certain chance of escaping the nearest
local attractor instead of being trapped in it. The slides of that talk can
be obtained here.
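
To make this concrete, here is a small, hypothetical Matlab/Octave sketch
(not the code from the talk): a simple (mu/mu_I, lambda)-ES with sigma
self-adaptation and plain gradient descent are both started at the same point
on the multimodal Rastrigin function. All parameter settings below are
illustrative choices, not recommendations.

% Hypothetical illustration: explorative ES vs. local gradient descent
% on the Rastrigin function f(y) = 10*N + sum(y_i^2 - 10*cos(2*pi*y_i)).
N = 10;
rastrigin = @(y) 10*N + sum(y.^2 - 10*cos(2*pi*y));
rast_grad = @(y) 2*y + 20*pi*sin(2*pi*y);        % exact gradient of f

% Gradient descent: deterministically descends into the attractor
% of its starting point.
y = 3*ones(N,1);
for g = 1:2000
  y = y - 0.001 * rast_grad(y);                  % small fixed step size
end
fprintf('gradient descent: f = %g\n', rastrigin(y));

% (mu/mu_I, lambda)-ES with log-normal sigma self-adaptation.
mu = 3; lambda = 12; tau = 1/sqrt(2*N);          % illustrative settings
yp = 3*ones(N,1); sigma = 1;                     % same start, large step size
for g = 1:2000
  sigmas = sigma * exp(tau*randn(1,lambda));     % mutate the step sizes
  ys = yp + (ones(N,1)*sigmas) .* randn(N,lambda);  % lambda offspring
  fs = zeros(1,lambda);
  for l = 1:lambda
    fs(l) = rastrigin(ys(:,l));
  end
  [~, idx] = sort(fs);                           % rank offspring (minimization)
  yp = mean(ys(:, idx(1:mu)), 2);                % intermediate recombination
  sigma = mean(sigmas(idx(1:mu)));               % recombine step sizes as well
end
fprintf('(mu/mu_I, lambda)-ES: f = %g\n', rastrigin(yp));

Typical runs end with the gradient descent stuck in the local minimum next to
its starting point, while the stochastic ES tends to reach a better basin; its
runs vary, which is precisely the point: exploration buys a chance of escape,
not a guarantee.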
By the way, even the interpretation of ESs as Monte Carlo approximators of the
so-called natural gradient does not hold for standard ESs such as the
Covariance Matrix Adaptation ES (CMA-ES). A discussion of that topic can be found in my paper
Convergence Analysis of Evolutionary Algorithms
That are Based on the Paradigm of Information Geometry. While the main
part of that paper is rather technical, the Introduction and the Conclusions
should be easy to follow.
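
For readers who have not seen that interpretation: information-geometric
algorithms such as natural evolution strategies update the parameters theta of
a search distribution p_theta along a Monte Carlo estimate of the natural
gradient of the expected fitness. In LaTeX notation (a standard textbook form,
not a formula quoted from the paper above):

\theta \leftarrow \theta + \eta\, F(\theta)^{-1} \nabla_{\theta} J(\theta),
\qquad J(\theta) = \mathrm{E}_{x \sim p_\theta}[f(x)],
\qquad F(\theta) = \mathrm{E}_{x \sim p_\theta}\!\left[
  \nabla_{\theta} \ln p_\theta(x)\, \nabla_{\theta} \ln p_\theta(x)^{\top}\right],

where the plain gradient is estimated from the offspring via the
log-likelihood trick,

\nabla_{\theta} J(\theta) \approx \frac{1}{\lambda} \sum_{k=1}^{\lambda}
  f(x_k)\, \nabla_{\theta} \ln p_\theta(x_k),
\qquad x_k \sim p_\theta .

The point made in the paper is that the update rules of standard ESs such as
the CMA-ES do not reduce to this scheme.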
2. Design Principles for Matrix Adaptation Evolution Strategies
In the paper Simplify Your
Covariance Matrix Adaptation Evolution Strategy we have shown that the
well-performing CMA-ES can be simplified by removing the covariance matrix
entirely, without performance degradation. As a result one obtains simpler
Matrix Adaptation Evolution Strategies (MA-ES) that allow for further
algorithm engineering addressing high-dimensional search spaces and
constrained optimization problems.
Here are tutorial slides
discussing these topics.
Matlab/Octave code of the basic algorithms can be found at
Downloads
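
For orientation, here is a rough Matlab/Octave sketch of the MA-ES core loop
(illustrative only; the reference implementations are on the Downloads page).
It follows the update equations of the paper above with CMA-like default
parameter settings; the sphere objective and the iteration budget are
placeholders.

% Rough MA-ES sketch: the covariance matrix of the CMA-ES is replaced by a
% transformation matrix M that is updated multiplicatively, so no
% eigendecomposition or matrix square root is needed.
f = @(y) sum(y.^2);                    % placeholder objective (sphere)
N = 10;
lambda = 4 + floor(3*log(N));          % CMA-like default population sizes
mu = floor(lambda/2);
w = log((lambda+1)/2) - log(1:mu)';    % positive recombination weights
w = w / sum(w);
mu_w = 1 / sum(w.^2);                  % variance-effective selection mass
c_s = (mu_w + 2) / (N + mu_w + 5);     % learning rates (CMA-like defaults)
d_s = 1 + c_s + 2*max(0, sqrt((mu_w - 1)/(N + 1)) - 1);
c_1 = 2 / ((N + 1.3)^2 + mu_w);
c_w = min(1 - c_1, 2*(mu_w - 2 + 1/mu_w) / ((N + 2)^2 + mu_w));
chiN = sqrt(N)*(1 - 1/(4*N) + 1/(21*N^2));   % approx. E||N(0,I)||

y = ones(N,1); sigma = 1;              % parental state
M = eye(N); s = zeros(N,1);            % transformation matrix, search path

for g = 1:300
  Z = randn(N, lambda);                % standard normal samples
  D = M * Z;                           % transformed mutation directions
  fs = zeros(1, lambda);
  for l = 1:lambda
    fs(l) = f(y + sigma*D(:,l));
  end
  [~, idx] = sort(fs);                 % best offspring first (minimization)
  Zsel = Z(:, idx(1:mu));
  zw = Zsel * w;                       % recombined z and d vectors
  dw = D(:, idx(1:mu)) * w;
  y = y + sigma*dw;                    % move the parent
  s = (1 - c_s)*s + sqrt(mu_w*c_s*(2 - c_s))*zw;        % search path
  M = M * (eye(N) + (c_1/2)*(s*s' - eye(N)) ...         % rank-1 update
           + (c_w/2)*(Zsel*diag(w)*Zsel' - eye(N)));    % rank-mu update
  sigma = sigma * exp((c_s/d_s)*(norm(s)/chiN - 1));    % CSA step size
end
fprintf('MA-ES: f = %g\n', f(y));

The sketch illustrates the design choice: since M is adapted by a
multiplicative update built from outer products of already-available vectors,
the spectral decomposition of the covariance matrix required by the CMA-ES is
no longer needed, which opens the door to the high-dimensional and constrained
variants mentioned above.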
last change: 14.07.2020