We are happy to announce the release of HeuristicLab 3.3.14, which we finished at this year's Genetic and Evolutionary Computation Conference (GECCO) in Denver, CO, USA.
HeuristicLab 3.3.14 "Denver" contains the following new features:
- New problems:
- Bin Packing
- Probabilistic TSP
- Multi-Objective Test Functions
- New data analysis algorithms:
- Gradient Boosted Regression
- Nonlinear Regression
- Gradient Charts
For a full list of all changes have a look at the ChangeLog.
Go to the Download page or click on the image below to get HeuristicLab 3.3.14 "Denver" now!
Right before our annual HeuristicLab retreat (this time in Windischgarsten, Austria), we are proud to announce the release of HeuristicLab 3.3.13 "Windischgarsten" with the following new features:
- New Algorithms:
- Age-layered Population Structure (ALPS)
- Offspring Selection Evolution Strategy (OSES)
- New Problems:
- Multi-objective external evaluation problem
- Genetic programming for code generation (Robocode)
- Further genetic programming problems: Even Parity, Multiplexer, Koza-style symbolic regression
- Additional accuracy metrics for classification models (and comparison to baseline algorithms)
- Quantile Regression based on Gradient Boosted Trees
- Mathematica export for symbolic regression/classification solutions
- Improved complexity measures for multi-objective symbolic regression
- Improved persistence of data-analysis models (SVM, Gaussian Process, GBT, Random Forest)
- Hive Statistics: A new WebApp that shows information about running jobs and available resources in HeuristicLab Hive
Following our conference-oriented release schedule, we are happy to announce the release of HeuristicLab 3.3.12 "Madrid" from this year's Genetic and Evolutionary Computation Conference (GECCO).
HeuristicLab 3.3.12 "Madrid" contains the following new features:
- New problem: NK[P,Q] landscapes
- New problem: Orienteering
- New encoding: Linear linkage
- New data analysis algorithm: Gradient boosted trees
- New optimizer: TimeLimitRun limits algorithm execution by wall-clock time and can take snapshots at predefined points
- Integration of the Sim# simulation framework as external library (Sim# at GitHub)
- Hive status page is replaced by a modular WebApp
- Improved and searchable "New Item" dialog
- C# code formatter for symbolic regression/classification
- The symbolic expression tree encoding can now be used with Programmable/Basic problems
- Kernel density estimation for the histogram control
As with every EuroCAST conference, the HeuristicLab development team is proud to announce the release of HeuristicLab 3.3.11 "Beach Bar".
Among others, HeuristicLab 3.3.11 "Beach Bar" contains the following new features:
- New algorithm: parameter-less population pyramid (P3). Thanks to Brian Goldman from the BEACON Center for the Study of Evolution in Action.
- New binary test problems:
- Deceptive trap problem
- Deceptive trap step problem
- HIFF problem
- New views for statistical testing and analysis of run collections
- New UI for C# scripts based on AvalonEdit
- New problem type: Programmable problem
- New APIs that make it easier to implement algorithms and problems
- Upgraded to .NET 4.5
For a full list of all changes in HeuristicLab 3.3.11 "Beach Bar" have a look at the ChangeLog.
Go to the Download page or click on the image below to get HeuristicLab 3.3.11 "Beach Bar" now!
With the GECCO conference just around the corner, it's time again for a new release! Therefore, the HeuristicLab development team is proud to announce the release of HeuristicLab 3.3.10 "Vancouver".
Among others, HeuristicLab 3.3.10 "Vancouver" contains the following new features:
- Scripting environment for rapid-prototyping (C# Script)
- New problems: External evaluation for Scilab and MATLAB
- Grammatical Evolution
- Data Preprocessing
- Redesigned start page containing even more samples, including scripting samples
For a full list of all changes in HeuristicLab 3.3.10 "Vancouver" have a look at the ChangeLog.
Go to the Download page or click on the image below to get HeuristicLab 3.3.10 "Vancouver" now!
The HeuristicLab development team is proud to announce the release of HeuristicLab 3.3.9.
Among others, HeuristicLab 3.3.9 contains the following new features:
- improved Hive server performance
- improved GP interpreter performance
- export of symbolic regression/classification solutions to Excel
- a new problem for the optimization of trading rules
For a full list of all changes in HeuristicLab 3.3.9 have a look at the ChangeLog.
Go to the Download page or click on the image below to get HeuristicLab 3.3.9 now!
With the latest HeuristicLab version 3.3.8 we released an implementation of Gaussian process models for regression analysis. Our purely managed C# implementation is mainly based on the MATLAB implementation by Rasmussen and Nickisch accompanying the book "Gaussian Processes for Machine Learning" by Rasmussen and Williams (available online).
If you want to try Gaussian process regression in HeuristicLab, simply open the preconfigured sample. You can also import a CSV file with your own data.
The Gaussian process model can be viewed as a Bayesian prior distribution over functions and is related to Bayesian linear regression.
Samples from two different one-dimensional Gaussian processes:
Similar to other models, such as the SVM, the GP model also uses the 'kernel trick' to handle high-dimensional non-linear projections into feature space efficiently.
'Fitting' the model means calculating the posterior Gaussian process distribution by conditioning the GP prior distribution on the observed data points in the training set. This leads to a posterior distribution in which functions that pass through the observed training points are more likely. From the posterior GP distribution it is easy to calculate the posterior predictive distribution. So, instead of a simple point prediction for each test point, it is possible to use the mean of the predictive distribution and calculate confidence intervals for the prediction at each test point.
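The conditioning step described above can be sketched in a few lines. The following is a minimal, self-contained illustration with a squared-exponential covariance, a zero mean function, and fixed hyper-parameters; function names such as `gp_predict` are made up for this sketch and have nothing to do with the HeuristicLab implementation.

```python
import math

def k(a, b, length=1.0):
    # squared-exponential covariance function
    return math.exp(-0.5 * ((a - b) / length) ** 2)

def solve(A, b):
    # Gaussian elimination with partial pivoting for small dense systems
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        p = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[p] = M[p], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def gp_predict(xs, ys, xstar, noise=0.1):
    # condition the zero-mean GP prior on the training points
    n = len(xs)
    K = [[k(xs[i], xs[j]) + (noise ** 2 if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    alpha = solve(K, ys)                       # K^-1 y
    kstar = [k(x, xstar) for x in xs]
    mean = sum(kstar[i] * alpha[i] for i in range(n))
    v = solve(K, kstar)                        # K^-1 k*
    var = k(xstar, xstar) - sum(kstar[i] * v[i] for i in range(n))
    # posterior mean and 95% confidence half-width at the test point
    return mean, 1.96 * math.sqrt(max(var, 0.0))

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [math.sin(x) for x in xs]
mean, half = gp_predict(xs, ys, 2.5)
```

The posterior mean at 2.5 lands close to sin(2.5), and the half-width shrinks near the training points, which is exactly the behavior shown in the confidence-band line chart below.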
The model is non-parametric and is fully specified via a mean function and a covariance function. The mean and covariance function often have hyper-parameters that have to be optimized to fit the model to a given training data set. For more information check out the book.
In HeuristicLab hyper-parameters of the mean and covariance functions are optimized w.r.t. the likelihood function (type-II ML) using the gradient-based BFGS algorithm. In the GUI you can observe the development of the likelihood and the values of the hyper-parameters over BFGS iterations. The output of the final Gaussian process model can also be visualized using a line chart that shows the mean prediction and the 95% confidence intervals.
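For reference, the quantity maximized in type-II ML is the standard log marginal likelihood of the GP (see Rasmussen and Williams, ch. 5); the negative of this value is what the line chart below tracks over BFGS iterations:

```latex
\log p(\mathbf{y} \mid X, \theta)
  = -\tfrac{1}{2}\,\mathbf{y}^\top K_y^{-1}\mathbf{y}
    - \tfrac{1}{2}\log\lvert K_y\rvert
    - \tfrac{n}{2}\log 2\pi,
\qquad K_y = K + \sigma_n^2 I
```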
Line chart of the negative log-likelihood:
Line chart of the optimized hyper-parameters:
Output of the model (mean and confidence interval):
We observed that Gaussian process models often produce very accurate predictions, especially for low-dimensional data sets with up to 5000 training points. For larger data sets the computational effort becomes prohibitive (we have not yet implemented sparse approximations).
Just in time for our demo at GPTP 2013, the HeuristicLab development team released HeuristicLab 3.3.8.
Among others, HeuristicLab 3.3.8 contains the following new features:
- Scatter Search
- Relevant Alleles Preserving GA (RAPGA)
- Symbolic Time-Series Prognosis
- Neighborhood Component Analysis
- Ensemble Modeling
- Gaussian Process Regression and Least-Squares Classification
- Job Shop Scheduling
- Linux support based on Mono
For a full list of all changes in HeuristicLab 3.3.8 have a look at the ChangeLog.
Go to the Download page or click on the image below to get HeuristicLab 3.3.8 now!
With HeuristicLab 3.3.7, a new version of the vehicle routing plugin has been released. The new plugin version 3.4 supports several variants of this rich problem class, which often occurs in practical applications dealing with transportation, and also provides standard benchmark instances known from the literature.
The supported variants are:
- Capacitated Vehicle Routing Problem (CVRP)
- Distance constrained CVRP (DCVRP)
- CVRP with time windows (CVRPTW)
- Multi-depot CVRP (MDCVRP)
- MDCVRP with time windows (MDCVRPTW)
- Pickup and delivery problem with time windows (PDPTW)
The architecture of the plugin is flexible, so new variants can be incorporated, which is planned for the future. Currently our work focuses on dynamic and stochastic VRPs. We will keep you posted on that and will soon release an add-on for HeuristicLab that adds support for dynamic vehicle routing.
It is very easy to experiment with the different benchmark instances, since two standard algorithms are included, namely a genetic algorithm and a tabu search. These algorithms work on all the included variants.
Download HeuristicLab 3.3.7 (http://dev.heuristiclab.com/trac/hl/core/wiki/Download) and give it a try!
Just in time for our tutorial at GECCO 2012, the HeuristicLab development team released HeuristicLab 3.3.7.
Among others, HeuristicLab 3.3.7 contains the following new features:
- new dialog to automatically create large experiments
- support for the optimization knowledge base (OKB)
- new and improved implementation of the Vehicle Routing Problem (VRP) which supports more VRP variants such as CVRP, DCVRP, CVRPTW, PDPTW, and MDCVRPTW
- lawn mower problem
- linear assignment problem and Hungarian algorithm
- benchmark problem instances: HeuristicLab now includes various libraries of published benchmark problem instances for combinatorial optimization problems (TSPLIB, QAPLIB, Taillard, Golden, Cordeau, Solomon, etc.) and regression/classification problems (Keijzer, Korns, Nguyen, real world problems, etc.)
For a full list of all changes in HeuristicLab 3.3.7 have a look at the ChangeLog.
Go to the Download page or click on the image below to get HeuristicLab 3.3.7 now!
We've included a new feature in the upcoming release of HeuristicLab 3.3.7 that will make it more convenient to create parameter variation experiments.
Metaheuristics and data analysis methods often have a number of parameters that highly influence their behavior and thus the quality you obtain when applying them to a certain problem. The best parameters are usually not known a priori, so you can either use metaoptimization (available from the download page under "Additional packages") or create a set of experiments where each parameter is varied accordingly. In the upcoming release we've made this variation task a lot easier.
We have enhanced the "Create Experiments" dialog that is available through the Edit menu. To try out the new feature you can obtain the latest daily build from the Download page and load one of the samples. The dialog allows you to specify the values of several parameters and allows you to create an experiment where all configurations are enumerated.
We have also included the new problem instance infrastructure in this dialog which further allows you to test certain configurations on a number of benchmark instances from several benchmark libraries.
Finally, here are a couple of points that you should be aware of to make effective use of this feature. You can view this as a kind of checklist, before creating and executing experiments:
- Before creating an experiment, make sure you prepare the algorithm accordingly and set all parameters that you do not want to vary to the values you intend. If the algorithm contains any runs, clear them first.
- Review the selected analyzers carefully; maybe you want to exclude the quality chart and some other analyzers that would produce too much data for a large experiment, or run the analyzers only every x-th iteration.
- Make sure to include in the run only those problem and algorithm parameters that you actually need. Think twice before showing a parameter in the run that requires a lot of memory.
- Make sure SetSeedRandomly (if available) is set to true if you intend to repeat each configuration.
- When you create experiments with dependent parameters, you have to resolve the dependencies and create separate experiments. For example, when one parameter specifies a lower bound and another an upper bound, you should create separate experiments for each lower bound so that you don't obtain configurations where the upper bound is lower than the lower bound.
- Finally, while you vary the parameters keep an eye on the number of variations. HeuristicLab doesn't prevent you from creating very large experiments, but if there are many variations you might want to create separate experiments.
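To get a feeling for the bound example and the number of variations, one way is to enumerate the grid in code and filter out invalid combinations before splitting them into separate experiments. The parameter names and values below are purely illustrative and are not part of the HeuristicLab API.

```python
from itertools import product

# hypothetical parameter grids, chosen for illustration only
lower_bounds = [0, 10, 20]
upper_bounds = [15, 25, 50]
mutation_rates = [0.05, 0.15]

configs = [
    {"lower": lo, "upper": up, "mutation": m}
    for lo, up, m in product(lower_bounds, upper_bounds, mutation_rates)
    if lo < up  # drop combinations where the bounds conflict
]
```

Of the 18 raw combinations, 16 remain after filtering; counting like this before hitting "create" helps keep experiments to a manageable size.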
I wanted to share a feature that we included in our trunk yesterday and that will be part of the next release. It is intended to aid people in benchmarking their algorithms. Usually, a number of benchmark problem instances are known and shared in the literature; these instances allow a direct comparison of the performance of two algorithms. We have now included some of these benchmark libraries as plugins and make them directly available within HeuristicLab. Currently we have included TSPLIB, QAPLIB, and several others. Each library is a separate plugin that stores the files inside the assembly. It is very easy to add new benchmark libraries, as well as to adapt the problems to use them. A problem can make use of multiple, even different, libraries.
We plan to expand this to include even more libraries so that you don't have to search the web for the files, but just load them in HeuristicLab. You can try the feature for yourself if you grab the latest daily build.
Happy new year! The HeuristicLab development team kicked off 2012 with the brand new HeuristicLab 3.3.6 release.
One of the most exciting new features of HeuristicLab 3.3.6 is HeuristicLab Hive, which provides an infrastructure for parallel and distributed computing. Hive consists of a server and computation slaves. Users can upload jobs to the Hive server, which distributes the jobs among the available slaves. The slaves execute the jobs and send the results back to the server after they are finished. More information about how to install a Hive server and Hive slaves and how to use Hive in general can be found in the HowTos.
Additional new features in HeuristicLab 3.3.6 include:
- Robust Taboo Search for the Quadratic Assignment Problem
- New Standard Algorithms for Regression and Classification (kNN, Neural Networks, Multinomial Logit Regression)
- Genetic Programming Grammar Editor
- New Standard Tree-Creation Operators for Genetic Programming (Grow, Full, Ramped Half-and-Half)
- RunCollectionModifiers to Combine and Transform Algorithm Results
- Improved Customization and Export of Charts
- Performance Benchmarks
For a full list of all changes in HeuristicLab 3.3.6 have a look at the ChangeLog.
Go to the Download page or click on the image below to get HeuristicLab 3.3.6 now!