Changes between Version 7 and Version 8 of Documentation/Reference/Particle Swarm Optimization
 Timestamp:
 03/31/11 23:03:09

Documentation/Reference/Particle Swarm Optimization
=== Local vs. Global PSO ===
In the (standard) global PSO, all the particles are neighbors of each other (= fully connected topology). Therefore the position of the global best particle is propagated to the whole swarm and affects every velocity update. If the current global best is far from the true optimum, it may become impossible for the swarm to explore other areas of the search space. Generally speaking, global PSOs usually converge faster but get trapped in local optima more easily.

In local PSO variants, particles are grouped into neighborhoods according to a certain strategy, and only the neighbor particles of a given particle can influence its velocity update. Consequently, local PSOs (with topologies) converge more slowly than global PSOs but are less likely to be captured in local minima, due to greater population diversity.
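The two variants differ only in which "best" position enters the velocity update. A minimal one-dimensional sketch in plain Python (this is an illustration, not HeuristicLab's actual API; the coefficient defaults are common values from the PSO literature):

```python
import random

def update_velocity(v, x, personal_best, best, w=0.729, c1=1.494, c2=1.494):
    """One-dimensional PSO velocity update.

    `best` is the swarm-wide best position in a global PSO, or the best
    position found within the particle's neighborhood in a local PSO.
    """
    r1, r2 = random.random(), random.random()
    return (w * v                            # inertia: keep part of the old velocity
            + c1 * r1 * (personal_best - x)  # cognitive pull toward the particle's own best
            + c2 * r2 * (best - x))          # social pull toward the (global/neighborhood) best
```

In the global variant every particle receives the same `best`, so a good position spreads through the swarm in a single iteration; in the local variant each particle only sees the best of its neighbors, which slows the spread and preserves diversity.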
An overview of the topologies available in HeuristicLab is depicted in the picture below:

[[Image(PSO_Topologies.png, height=200px, marginright=30, marginleft=30)]]

The fully connected topology is the standard configuration (= global PSO). If you wish to use another topology, you have to set the `TopologyInitializer` parameter to one of the following:
* `RandomTopologyInitializer`: Randomly connects every particle with `NroOfParticles` other particles. Neighborhood connections are not symmetric, meaning that if particle A is a neighbor of particle B, particle B does not necessarily have to be a neighbor of particle A. The default `NroOfParticles` is 3. The picture above shows a random neighborhood with `NroOfParticles` = 2.
* `RingTopologyInitializer`: Every particle is connected with its preceding and its following particle.
* `VonNeumannTopologyInitializer`: Every particle is connected with 4 other particles, the two following and the two previous, wrapping around at the beginning and the end of the population.

[#References (Kennedy and Mendes, 2002)] and, in more detail, [#References (Mendes, 2004)] investigated the impact of different topologies on algorithm performance. They found that the global PSO was quick to converge but yielded the worst results. Circular and wheel topologies improved the results moderately, and using the Von Neumann topology produced the best results.

…
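Such neighborhoods reduce to simple index arithmetic over the particle indices 0..n-1. A sketch in plain Python (the function names are ours for illustration, not HeuristicLab's):

```python
import random

def ring_topology(n):
    """Each particle is connected to its preceding and following particle (wrapping)."""
    return {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}

def von_neumann_topology(n):
    """Each particle is connected to the two preceding and two following particles."""
    return {i: [(i - 2) % n, (i - 1) % n, (i + 1) % n, (i + 2) % n] for i in range(n)}

def random_topology(n, nr_of_particles=3, rng=random):
    """Each particle is connected to `nr_of_particles` randomly chosen other particles.

    Connections need not be symmetric: j may appear in i's neighborhood
    without i appearing in j's.
    """
    return {i: rng.sample([j for j in range(n) if j != i], nr_of_particles)
            for i in range(n)}
```

In a local PSO, the velocity update of particle `i` then uses the best position found among `topology[i]` instead of the swarm-wide best.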
Also, a recent paper by [#References (Pedersen 2010)] provides a most helpful table of PSO parameter settings that have been tuned via meta-optimization for different optimization scenarios. We recommend them as a first starting point when optimizing new problems. Some of the settings (like using a negative inertia weight and personal attraction weight) may seem quirky, but we have also achieved some very good results with those settings.

=== Dynamic Parameters ===