Changes between Version 3 and Version 4 of Documentation/Reference/Particle Swarm Optimization


Timestamp: 03/31/11 13:42:09
Author: mkofler
Comment: Improved PSO wiki page

Legend: unchanged lines are unprefixed; lines removed in v4 are prefixed with `-`, added lines with `+`.
  • Documentation/Reference/Particle Swarm Optimization

    v3 → v4

= Particle Swarm Optimization =
- Particle swarm optimization (PSO) is a stochastic population-based optimization algorithm that was first introduced by Kennedy and Eberhart. Members of the population are called particles. Each particle moves in search space, taking advantage of the particle’s own experience and the experience of the particle’s neighbours or the experience of the whole swarm.
+ Particle swarm optimization (PSO) is a stochastic population-based optimization algorithm that was first introduced by Kennedy and Eberhart. Members of the population (= swarm) are called particles. Each particle moves around in the search space, taking advantage of the particle’s own experience and the experience of the particle’s neighbours or of the whole swarm to guide the search. A great introduction to PSOs can be found in [#References (Kennedy and Eberhart, 2001)].

Our particle swarm optimization algorithm has been designed to work with arbitrary solution encodings. However, so far a concrete implementation is only available for real-vector encoded problems, i.e. [[Single Objective Test Function]].
     

=== Local vs. Global PSO ===
- Topology initializers group the particles into neighborhoods according to a certain strategy.
+ In the (standard) global PSO, all particles are neighbors of each other. The position of the global best particle is therefore propagated to the whole swarm and affects every velocity update. If the current global best is far from the true optimum, it may become impossible for the swarm to explore other areas of the search space. Generally speaking, global PSOs converge faster but get trapped in local optima more easily.
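The global-best velocity update described above can be sketched in a few lines. The following is an illustrative stand-alone Python sketch, not the framework's implementation: the function names and the parameter values (inertia weight 0.72, acceleration coefficients 1.49) are assumptions chosen as commonly used defaults.

```python
# Minimal global-best PSO sketch for a real-vector problem.
# Illustrative only; names and parameter values are assumptions.
import random

def sphere(x):
    # Simple real-vector test function; global minimum 0 at the origin.
    return sum(xi * xi for xi in x)

def pso(f, dim=5, swarm_size=20, iterations=200,
        inertia=0.72, c_personal=1.49, c_social=1.49, bound=5.0):
    random.seed(0)  # deterministic for demonstration
    pos = [[random.uniform(-bound, bound) for _ in range(dim)]
           for _ in range(swarm_size)]
    vel = [[0.0] * dim for _ in range(swarm_size)]
    pbest = [p[:] for p in pos]                   # each particle's own best position
    pbest_val = [f(p) for p in pos]
    g = min(range(swarm_size), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm-wide best (global PSO)

    for _ in range(iterations):
        for i in range(swarm_size):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Velocity update: inertia + pull toward the particle's own best
                # + pull toward the global best (this last term is what a local
                # PSO replaces with the neighborhood best).
                vel[i][d] = (inertia * vel[i][d]
                             + c_personal * r1 * (pbest[i][d] - pos[i][d])
                             + c_social * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```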

- So far, the following topologies have been implemented:
+ In local PSO variants, particles are grouped into neighborhoods according to a certain strategy. Only the neighbors of a given particle can then influence its velocity update. Consequently, local PSOs (with topologies) converge more slowly than global PSOs but are less likely to get caught in local minima due to greater population diversity.
+
+ So far, the following neighborhood topologies have been implemented:
* `RandomTopologyInitializer`: Randomly connects every particle with `NroOfParticles` other particles. Neighborhood connections are not symmetric: if particle A is a neighbor of particle B, particle B does not necessarily have to be a neighbor of particle A. The default `NroOfParticles` is 3.
* `RingTopologyInitializer`: Every particle is connected with its preceding and its following particle.
* `VonNeumannTopologyInitializer`: Every particle is connected with the two following and the two previous particles, wrapping around at the beginning and the end of the population.
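The three topologies can be sketched as adjacency lists over particle indices. The Python function names below merely mirror the initializer names (the actual framework operators are not shown), and `neighborhood_best` illustrates how a local PSO picks the guiding position for the velocity update.

```python
# Illustrative neighborhood topologies as adjacency lists; names mirror the
# initializers described above but are not the framework's implementation.
import random

def random_topology(n, nr_of_particles=3, seed=0):
    # Each particle gets nr_of_particles random neighbors; connections are
    # not symmetric, as described for RandomTopologyInitializer.
    rng = random.Random(seed)
    return {i: rng.sample([j for j in range(n) if j != i], nr_of_particles)
            for i in range(n)}

def ring_topology(n):
    # Preceding and following particle, wrapping around.
    return {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}

def von_neumann_topology(n):
    # Two previous and two following particles, wrapping around.
    return {i: [(i - 2) % n, (i - 1) % n, (i + 1) % n, (i + 2) % n]
            for i in range(n)}

def neighborhood_best(i, topology, pbest_val):
    # In a local PSO the velocity update uses the best position found within
    # particle i's neighborhood (including i itself) instead of the global best.
    candidates = [i] + topology[i]
    return min(candidates, key=lambda j: pbest_val[j])
```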

- If you want to implement your own topology, you must inherit from `ITopologyInitializer` or derive from the base class `TopologyInitializer`.
+ [#References (Kennedy and Mendes, 2002)] investigated the impact of different topologies on algorithm performance. They found that the global PSO was quick to converge but yielded the worst results. Circular and wheel topologies improved the results moderately, and the Von Neumann topology produced the best results.
+
+ Once initialized, the topologies are static and do not change during algorithm execution. If you want to implement your own topology, you must inherit from `ITopologyInitializer` or derive from the base class `TopologyInitializer`.

- In general, local PSOs (with topologies) converge more slowly than global PSOs but are less likely to be captured in local minima due to greater population diversity. (Kennedy and Mendes, 2002) investigated the impact of different topologies on algorithm performance. They found that:
- * Global PSO: quick to converge, worst results
- * Circular Topology: moderate results
- * Wheel Topology: moderate results
- * Von Neumann Topology: best results

=== Parameter Adjustment ===

'''Parameter Tuning'''

- A recent paper by (Pedersen 2010) provides a most helpful table of PSO parameters that have been tuned for different optimization scenarios. We recommend them as a first starting point when optimizing new problems. Some of the settings (like using a negative inertia weight) may seem quirky, but we also got some very good results with those settings.
+ A recent paper by [#References (Pedersen 2010)] provides a most helpful table of PSO parameters that have been tuned for different optimization scenarios. We recommend them as a first starting point when optimizing new problems. Some of the settings (like using a negative inertia weight) may seem quirky, but we also got some very good results with those settings.

'''Velocity Bounds'''


'''Topology Updaters'''
- * `MultiPSOTopologyUpdater`: The whole population is divided into `NrOfSwarms` non-overlapping sub-swarms. Neighborhood topology is dynamic and randomly assigned. Swarms are re-grouped every `regroupingPeriod` iterations. The operator is implemented as described in (Liang and Suganthan 2005).
+ * `MultiPSOTopologyUpdater`: The whole population is divided into `NrOfSwarms` non-overlapping sub-swarms. Neighborhood topology is dynamic and randomly assigned. Swarms are re-grouped every `regroupingPeriod` iterations. The operator is implemented as described in [#References (Liang and Suganthan 2005)].
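The regrouping idea behind `MultiPSOTopologyUpdater` can be sketched as follows. This is a hedged illustration only: the function names and the deal-out assignment scheme are assumptions, not the operator's actual implementation.

```python
# Sketch of multi-swarm regrouping: particles are randomly partitioned into
# nr_of_swarms non-overlapping sub-swarms, and the partition is redrawn every
# regrouping_period iterations. Names are illustrative assumptions.
import random

def regroup(n_particles, nr_of_swarms, rng):
    # Shuffle particle indices, then deal them into sub-swarms so the groups
    # are non-overlapping and roughly equal in size.
    indices = list(range(n_particles))
    rng.shuffle(indices)
    swarms = [[] for _ in range(nr_of_swarms)]
    for k, i in enumerate(indices):
        swarms[k % nr_of_swarms].append(i)
    return swarms

def maybe_regroup(iteration, regrouping_period, n_particles, nr_of_swarms,
                  rng, current):
    # Redraw the partition only at the start of each regrouping period;
    # otherwise keep the current sub-swarm assignment.
    if iteration % regrouping_period == 0:
        return regroup(n_particles, nr_of_swarms, rng)
    return current
```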

=== Is there a sample/tutorial? ===

- We are currently preparing one. Please stay tuned.
+ A global PSO with dynamically updated inertia and velocity bounds has been implemented for the Schwefel test function. Details and parameter settings:
+
+ [UsersSamples#GPAA Particle Swarm Optimization + Schwefel Test Function]
+
+ [=#References]

=== References: ===
* Kennedy, J. and Eberhart, R.C., 2001. Swarm Intelligence. Morgan Kaufmann. ISBN 1-55860-595-9.