A Tutorial on Particle Swarm Optimization in Python

A metaheuristic algorithm suitable for optimizing nonlinear continuous functions.

There are several approaches for maximizing or minimizing a function to find its optimal value, and no single one is considered ideal for every case; each optimization approach has its own advantages and limitations. Particle Swarm Optimization (PSO) is an optimization technique belonging to the field of nature-inspired computing. It searches a solution space in a straightforward way, optimizing a problem by iteratively trying to improve candidate solutions with regard to a given measure of quality. In this article, we will discuss Particle Swarm Optimization in detail, along with its inner working and different variants. We will also walk through a hands-on implementation of PSO using the Python package PySwarms. We will cover the following major points in this article.

Table of Contents

  1. Particle Swarm Optimization (PSO)
  2. Inner working
  3. Variants of PSO
  4. Implementing PSO using PySwarms

Let’s start the discussion by understanding the Particle Swarm Optimization (PSO) algorithm.

Particle Swarm Optimization (PSO)

Several studies on the social behaviour of animal groups were conducted in the early 1990s. These investigations revealed that some creatures in a specific group, particularly birds and fish, are able to transmit knowledge among themselves and that this ability provides these animals with a significant survival benefit. Inspired by this research, Kennedy and Eberhart presented the PSO algorithm in 1995, a metaheuristic algorithm suitable for optimizing nonlinear continuous functions. The algorithm was inspired by the concept of swarm intelligence, which is commonly exhibited in animal groupings such as flocks and shoals.


As stated in the original study, fish or a flock of birds moving in a group “may benefit from the experience of all other members.” In other words, if a bird is flying around randomly looking for food, all of the birds in the flock can share their discoveries and help the entire flock get the best hunt. While we may imitate the movement of a flock of birds, we can also assume that each bird is assisting us in finding the optimal solution in a high-dimensional solution space, with the best solution found by the flock being the best solution in the space.

Inner working of PSO

Researchers distinguish between exploratory swarm behaviour (searching a larger part of the search space) and exploitative behaviour (searching a smaller region of the search space to close in on a, possibly local, optimum). Since PSO’s inception, researchers have argued that the algorithm and its parameters must be designed to strike an appropriate balance between exploration and exploitation, avoiding early convergence to a local optimum while still ensuring a good rate of convergence to the optimum.


Regardless of how the swarm operates, convergence to a local optimum occurs when all personal bests P, or alternatively the swarm’s best-known position G, approach a local optimum of the problem.
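The personal bests P and the swarm’s best-known position G mentioned above drive the canonical PSO update rule. The following is a minimal sketch of that rule on the sphere function; the function name, parameter values, and bounds are our own illustrative choices, not part of the original article.

```python
import numpy as np

def pso_sphere(n_particles=20, dims=2, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal canonical PSO minimizing the sphere function f(x) = sum(x**2)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dims))   # particle positions
    v = np.zeros((n_particles, dims))             # velocities (zero-initialized)
    p = x.copy()                                  # personal bests P
    p_cost = np.sum(p**2, axis=1)
    g = p[np.argmin(p_cost)].copy()               # swarm's best-known position G
    g_cost = p_cost.min()
    for _ in range(iters):
        r1 = rng.random((n_particles, dims))
        r2 = rng.random((n_particles, dims))
        # velocity update: inertia + cognitive (toward P) + social (toward G) terms
        v = w * v + c1 * r1 * (p - x) + c2 * r2 * (g - x)
        x = x + v
        cost = np.sum(x**2, axis=1)
        improved = cost < p_cost                  # update personal bests
        p[improved] = x[improved]
        p_cost[improved] = cost[improved]
        if p_cost.min() < g_cost:                 # update the global best
            g_cost = p_cost.min()
            g = p[np.argmin(p_cost)].copy()
    return g, g_cost
```

Here w is the inertia weight and c1, c2 are the cognitive and social acceleration coefficients; these are exactly the hyperparameters tuned later in the PySwarms example.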

The ability of a PSO algorithm to explore and exploit may be affected by its topology structure; that is, with a different structure, the algorithm’s convergence speed and ability to avoid premature convergence on the same optimization problem will be different because a topology structure determines the search information sharing speed or direction for each particle. The two most common topological structures are the global star and the local ring.

A PSO with a global star structure, in which all particles are connected, has the shortest average distance in the swarm, while a PSO with a local ring structure, in which every particle is connected only to its two nearest particles, has the largest average distance in the swarm.

The experimental study examines the two most commonly used topologies, the global star structure (figure 1a) and the local ring structure (figure 1b), each with 16 particles. It should be emphasized that in the local structure the nearest particles are determined primarily by particle index.
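Since the text notes that neighbours in the local ring structure are determined by particle index, both neighbourhood rules can be sketched in a few lines; the function names here are our own, chosen for illustration.

```python
def ring_neighbors(i, n_particles):
    """Local ring topology: each particle is informed by its two
    index-adjacent neighbours (indices wrap around the ring)."""
    return ((i - 1) % n_particles, (i + 1) % n_particles)

def star_neighbors(i, n_particles):
    """Global star topology: every particle is informed by all others,
    so search information spreads through the swarm fastest."""
    return tuple(j for j in range(n_particles) if j != i)

# For the 16-particle swarm of the study, particle 0's ring neighbours
# are particles 15 and 1, while its star neighbourhood is all 15 others.
```

The ring neighbourhood slows the spread of the best-known position, which tends to favour exploration; the star neighbourhood spreads it immediately, which tends to favour exploitation.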

Adaptive Mechanism 

An adaptive mechanism can be introduced so that the trade-off between convergence (‘exploitation’) and divergence (‘exploration’) does not have to be fixed in advance. Adaptive particle swarm optimization (APSO) outperforms regular particle swarm optimization (PSO): it can execute global searches across the entire search space with a faster convergence time.

It allows for real-time modification of the inertia weight, acceleration coefficients, and other computational factors, resulting in increased search efficacy and efficiency. APSO can also operate on the best particle globally to jump out of the most likely local optima.

Variants of PSO

Even a simple PSO algorithm admits many variations: the particles and velocities can be initialized in different ways (for example, starting with zero velocities), Pi and G can be updated only after the entire swarm has been updated, and so on.

Gradient PSO

To construct gradient-based PSO algorithms, the ability of the PSO algorithm to efficiently explore many local minimums can be combined with the ability of gradient-based local search algorithms to effectively calculate an accurate local minimum. 

The PSO algorithm is used in gradient-based PSO algorithms to explore several local minima and discover a location in the basin of attraction of a deep local minimum. The deep local minimum is then properly located using efficient gradient-based local search techniques.
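The local refinement step described above can be sketched with plain gradient descent; here we refine a candidate on the sphere function, whose gradient is 2x. The function names, learning rate, and starting point are our own illustrative choices.

```python
def refine(x, grad, lr=0.1, steps=100):
    """Gradient-descent refinement of a candidate solution: once PSO has
    placed x in the basin of attraction of a deep local minimum, a
    gradient-based local search locates that minimum accurately."""
    for _ in range(steps):
        x = [xi - lr * gi for xi, gi in zip(x, grad(x))]
    return x

# Gradient of the sphere function f(x) = sum(x**2) is 2x.
sphere_grad = lambda x: [2 * xi for xi in x]

# Pretend PSO returned the rough candidate [0.3, -0.2]; refinement
# pulls it very close to the true minimum at the origin.
refined = refine([0.3, -0.2], sphere_grad)
```

In practice the gradient step would use a proper local optimizer rather than a fixed learning rate, but the division of labour is the same: PSO explores, the gradient method exploits.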

Hybrid PSO

In order to increase optimization performance, new and more advanced PSO variations are being introduced. There are certain developments in that study, such as developing a hybrid optimization approach that combines PSO with other optimizers, such as combining PSO with biogeography-based optimization and including an effective learning mechanism.

Implementing Particle Swarm Optimization using PySwarms

PySwarms is a Python-based tool for particle swarm optimization. It is used by swarm intelligence researchers, practitioners, and students who want a declarative, high-level interface for applying PSO to their problems. PySwarms offers both basic optimization with PSO and tools for interacting with swarm optimizations.

PySwarms implements many particle swarm optimization techniques at a high level. As a result, it aims to be user-friendly and adaptable. Supporting modules can also be employed to assist you with your optimization problem.

In this section, we will implement the global-best optimizer using PySwarms’s pyswarms.single.GlobalBestPSO class. We will also plot the function in both 2D and 3D.

PySwarms can be installed straight away with pip install pyswarms 

Optimizing the Sphere Function
# Import PySwarms
import pyswarms as ps
from pyswarms.utils.functions import single_obj as fx

We’ll work on optimizing the sphere function. Let’s use some arbitrary settings in our optimizer for now. At a minimum, optimization takes three steps:

  • To configure the swarm as a dict, set the hyperparameters.
  • Pass the dictionary along with the relevant inputs to create an instance of the optimizer.
  • Invoke the optimize() method, and tell it to save the best cost and position in a variable.
# Set-up hyperparameters
options = {'c1': 0.5, 'c2': 0.3, 'w':0.9}
# Call instance of PSO
optimizer = ps.single.GlobalBestPSO(n_particles=10, dimensions=2, options=options)
# Perform optimization
cost, pos = optimizer.optimize(fx.sphere, iters=1000)

This will run the optimizer for 1000 iterations before returning the swarm’s best cost and position.

Visualizing the Function

PySwarms includes tools for visualizing your swarm’s behaviour. These are built on top of matplotlib, resulting in user-friendly and highly customizable charts. The plotters module has two animation methods: plot_contour() and plot_surface(). As their names imply, these methods plot the particles in a 2-D or 3-D space.

In order to plot the sphere function, we should add meshes to our swarm. This allows us to see where the particles are in relation to our objective function graphically. Using the Mesher class, we can achieve this.

import matplotlib.pyplot as plt
from pyswarms.utils.plotters import plot_contour, plot_surface
from pyswarms.utils.plotters.formatters import Designer
from pyswarms.utils.plotters.formatters import Mesher

The pyswarms.utils.plotters.formatters module contains many formatters for customizing your plots and visualizations. Aside from Mesher, there’s a Designer class for modifying font sizes, figure sizes, and so on, as well as an Animator class for setting animation delays and repeats.

2-D Plot
m = Mesher(func=fx.sphere)
# Make animation
animation = plot_contour(pos_history=optimizer.pos_history,
                         mesher=m,    # objective function to draw contours of
                         mark=(0,0))  # Mark minima
3-D Plot
# preprocessing
pos_history_3d = m.compute_history_3d(optimizer.pos_history)
# adjust the figure
d = Designer(limits=[(-1,1), (-1,1), (-0.1,1)], label=['x-axis', 'y-axis', 'z-axis'])
# Make animation
animation3d = plot_surface(pos_history=pos_history_3d, 
                           mesher=m, designer=d,       
                           mark=(0,0,0))  # Mark minima

Final Words

In this post, we have seen the theory behind PSO and how its inner mechanism works. We also looked at a few of its variants and at how the community has applied PSO across different domains. Lastly, we gained hands-on experience with PSO by leveraging the Python-based PySwarms library.


Vijaysinh Lendave
Vijaysinh is an enthusiast in machine learning and deep learning. He is skilled in ML algorithms, data manipulation, handling, visualization, and model building.
