Posts in category Algorithms

Grammatical Evolution in HeuristicLab

The latest version of HeuristicLab now also includes a plugin for grammatical evolution (GE)! The plugin has been implemented by Sabine Winkler for her Bachelor's thesis, and she kindly contributed her implementation to HeuristicLab, so that we could integrate it into the stable development branch.

So far, the implementation supports two problems: symbolic regression and the artificial ant problem for the Santa Fe trail. However, additional problem types for grammatical evolution can be added easily, in a similar way to tree-based GP (see our documentation on implementing GP problems).

The implementation of GE reuses existing classes and methods for tree-based GP and acts mainly as a translator between the integer vector encoding and the tree encoding. In the mapping process the grammar is used to build a tree from the integer vector. The concept of the mapping process is shown in the following diagram:

The GE plugin also provides multiple variants of mappers that have been described in the literature, including the depth-first, breadth-first, and π-GE mappers. In GE the genotype-to-phenotype mappers use a grammar to create a tree from an integer vector. The elements of the integer vector are read from left to right and the tree is built starting from the root symbol of the grammar. For each non-terminal symbol the mapper reads one element from the vector and chooses the alternative for extending the tree based on this value. An example for the depth-first mapper is shown in the following figure.
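To make the mapping idea concrete, here is a minimal sketch of a depth-first genotype-to-phenotype mapping in Python. It uses a toy arithmetic grammar; the grammar, function names, and the wrapping behavior chosen here are illustrative assumptions and do not reflect the HeuristicLab implementation.

```python
# Minimal depth-first GE mapping sketch (illustrative, not the HeuristicLab API).
GRAMMAR = {
    "<expr>": [["<expr>", "<op>", "<expr>"], ["<var>"]],
    "<op>":   [["+"], ["-"], ["*"]],
    "<var>":  [["x"], ["y"]],
}

def map_depth_first(genotype, start="<expr>", max_expansions=100):
    """Expand the left-most non-terminal first, consuming one codon per choice."""
    derivation = [start]
    codon_index = 0
    for _ in range(max_expansions):
        # find the left-most non-terminal (depth-first expansion order)
        nt_pos = next((i for i, s in enumerate(derivation) if s in GRAMMAR), None)
        if nt_pos is None:
            return " ".join(derivation)          # only terminals left: done
        rules = GRAMMAR[derivation[nt_pos]]
        # the codon modulo the number of alternatives selects the production;
        # the genotype is read circularly ("wrapping") if it is exhausted
        codon = genotype[codon_index % len(genotype)]
        derivation[nt_pos:nt_pos + 1] = rules[codon % len(rules)]
        codon_index += 1
    return None  # mapping did not terminate within the expansion limit

print(map_depth_first([0, 1, 0, 2, 1]))  # -> "x * x"
```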

In GE the genotype-to-phenotype mapping is prone to the so-called ripple effect: a small change in the integer vector can lead to a completely different tree after mapping. This hampers the search process and can lead to premature convergence of the algorithm.
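Using the toy mapper sketched above, the ripple effect can be seen by changing only a single codon: the production chosen at the root changes, so every subsequent codon is re-interpreted in a different derivation context (again, only an illustrative example, not HeuristicLab code).

```python
# Changing only the first codon yields a structurally different phenotype.
print(map_depth_first([0, 1, 0, 2, 1]))   # e.g. "x * x"
print(map_depth_first([1, 1, 0, 2, 1]))   # first codon changed: e.g. "y"
```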

We are happy about contributions to the grammatical evolution implementation or to HeuristicLab in general. So, if you have implemented an algorithm, a problem, or even just a custom operator, feel free to contribute!

Figures © Sabine Winkler, reproduced with kind permission.

CMA-ES Implementation in HeuristicLab

We are working on providing an implementation of CMA-ES in HeuristicLab for real-valued optimization. The developments are tracked in ticket #1961 and the implementation is based on Hansen's Java implementation. Porting that code to C# is now almost complete and the integration into HeuristicLab looks very nice. Although most of the original code could be preserved, the integration required breaking some parts apart and reassembling them in a different way. I was very pleased to see that Hansen provides "trusted output" image samples so that implementers can check whether their version performs similarly. Here is a comparison of HeuristicLab's implementation (left) with the trusted output images (right).

Sphere

CMA-ES performance on 10-D Sphere test function https://www.lri.fr/~hansen/fsphere10D.png

Rosenbrock

CMA-ES performance on 10-D Rosenbrock test function https://www.lri.fr/~hansen/frosen10D.png

Rastrigin

CMA-ES performance on 10-D Rastrigin test function https://www.lri.fr/~hansen/frast10D.png
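For readers who want to see what the (μ/μ_w, λ)-CMA-ES loop behind these plots looks like, here is a compact, self-contained Python/NumPy sketch following Hansen's tutorial formulation. It is only meant to illustrate the algorithm; it is not the HeuristicLab plugin, and the function names, stopping criterion, and parameter choices here are assumptions made for the example.

```python
# Minimal (mu/mu_w, lambda)-CMA-ES sketch (illustrative, not the HeuristicLab plugin).
import numpy as np

def cma_es(objective, x0, sigma, max_evals=10000):
    n = len(x0)
    xmean = np.array(x0, dtype=float)
    lam = 4 + int(3 * np.log(n))                    # population size
    mu = lam // 2
    weights = np.log(mu + 0.5) - np.log(np.arange(1, mu + 1))
    weights /= weights.sum()                        # recombination weights
    mueff = 1.0 / np.sum(weights ** 2)

    # strategy parameters (adaptation rates)
    cc = (4 + mueff / n) / (n + 4 + 2 * mueff / n)
    cs = (mueff + 2) / (n + mueff + 5)
    c1 = 2 / ((n + 1.3) ** 2 + mueff)
    cmu = min(1 - c1, 2 * (mueff - 2 + 1 / mueff) / ((n + 2) ** 2 + mueff))
    damps = 1 + 2 * max(0, np.sqrt((mueff - 1) / (n + 1)) - 1) + cs

    # dynamic state: evolution paths, covariance matrix and its decomposition
    pc = np.zeros(n); ps = np.zeros(n)
    C = np.eye(n); B = np.eye(n); D = np.ones(n); invsqrtC = np.eye(n)
    chiN = np.sqrt(n) * (1 - 1 / (4 * n) + 1 / (21 * n ** 2))
    evals = 0

    while evals < max_evals:
        # sample lambda offspring from N(xmean, sigma^2 * C)
        z = np.random.randn(lam, n)
        y = z @ np.diag(D) @ B.T                    # y ~ N(0, C)
        x = xmean + sigma * y
        fitness = np.array([objective(xi) for xi in x])
        evals += lam
        order = np.argsort(fitness)

        # recombination: weighted mean of the mu best offspring
        xold = xmean
        xmean = weights @ x[order[:mu]]

        # cumulation: update evolution paths for step size and covariance
        ps = (1 - cs) * ps + np.sqrt(cs * (2 - cs) * mueff) * invsqrtC @ (xmean - xold) / sigma
        hsig = (np.linalg.norm(ps) / np.sqrt(1 - (1 - cs) ** (2 * evals / lam)) / chiN
                < 1.4 + 2 / (n + 1))
        pc = (1 - cc) * pc + hsig * np.sqrt(cc * (2 - cc) * mueff) * (xmean - xold) / sigma

        # covariance matrix adaptation (rank-one and rank-mu updates)
        artmp = (x[order[:mu]] - xold) / sigma
        C = ((1 - c1 - cmu) * C
             + c1 * (np.outer(pc, pc) + (1 - hsig) * cc * (2 - cc) * C)
             + cmu * artmp.T @ np.diag(weights) @ artmp)

        # step-size control
        sigma *= np.exp((cs / damps) * (np.linalg.norm(ps) / chiN - 1))

        # eigen-decomposition of C for sampling and for C^(-1/2)
        C = np.triu(C) + np.triu(C, 1).T            # enforce symmetry
        eigvals, B = np.linalg.eigh(C)
        D = np.sqrt(eigvals)
        invsqrtC = B @ np.diag(1 / D) @ B.T

        if fitness[order[0]] < 1e-10:
            break
    return xmean, fitness[order[0]]

# e.g. a 10-D sphere function, as in the first trusted-output comparison
best_x, best_f = cma_es(lambda v: float(np.sum(v ** 2)), x0=np.ones(10) * 0.5, sigma=0.3)
print(best_f)
```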

The plugin is not yet part of the trunk, but it is available in a branch that can be compiled by anyone. I have attached a current binary version of the plugin to this post if you want to try it (add it to the latest daily build and unblock the dll after downloading it). CMA-ES can currently be used to solve any problem that is based on the Encodings.RealVector plugin and uses an IRealVectorCreator as solution creator. Of course, the implementation may still change and may break existing files.