Nature Inspired Metaheuristics for Optimization
Samarth Gupta
This project implements Particle Swarm Optimization (swarm intelligence) and a Genetic Algorithm (evolutionary intelligence) in Python and applies them to a function optimization problem, the knapsack problem, and a text decoding problem.
Project status: Under Development
Groups
Student Developers for AI
Intel Technologies
Intel Python
Overview / Usage
This project uses Particle Swarm Optimization (PSO) to find the minimum value of a function of 10 variables. PSO is a swarm-intelligence-based metaheuristic that searches for optima by updating generations of random particles according to a local and a global best value after each iteration.
We also use a **Genetic Algorithm**, a class of evolutionary algorithm, to solve the famous knapsack problem and to decode a particular text pattern. Both problems have many applications, primarily in combinatorics, computer science, complexity theory, cryptography, applied mathematics, and even daily fantasy sports.
Methodology / Approach
**Working of Genetic Algorithm:** Genetic algorithms apply the following operators in succession:

Selection: After every generation, a pool of individuals is selected from the population to serve as breeders for the next generation. Selection is usually based on a fitness test: only individuals whose fitness value is above a certain threshold (required) value survive.
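As a minimal sketch of this threshold-style selection (not taken from the repository; the function and parameter names are illustrative), the population can be ranked by fitness and only the top fraction kept as breeders:

```python
import random

def truncation_selection(population, fitness_fn, survival_fraction=0.5):
    """Keep the fittest fraction of the population as breeders."""
    ranked = sorted(population, key=fitness_fn, reverse=True)
    cutoff = max(1, int(len(ranked) * survival_fraction))
    return ranked[:cutoff]

# Example: bit-string chromosomes, fitness = number of 1-bits.
population = [[random.randint(0, 1) for _ in range(8)] for _ in range(10)]
breeders = truncation_selection(population, fitness_fn=sum)
```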

Crossover: This biologically inspired operator breeds parents to produce successive generations of 'fit' children. There are many crossover techniques, such as single-point crossover and uniform crossover; this project uses single-point crossover.
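Single-point crossover can be sketched as follows (an illustrative helper, not code from the repository): both parents are cut at the same point and their tails are swapped to produce two children.

```python
import random

def single_point_crossover(parent1, parent2, point=None):
    """Cut two equal-length chromosomes at one point and swap the tails."""
    if point is None:
        point = random.randint(1, len(parent1) - 1)
    child1 = parent1[:point] + parent2[point:]
    child2 = parent2[:point] + parent1[point:]
    return child1, child2
```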

Mutation: Mutation is a genetic operator used to maintain genetic diversity from one generation of a population to the next. It prevents the population from getting stuck at a local optimum, allowing the search to reach the global optimum. There are many mutation techniques, such as swap mutation and flip mutation; flip mutation is used here.
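For bit-string chromosomes, flip mutation can be sketched as below (an illustrative helper, assuming each gene is flipped independently with a small probability; the rate value is an assumption, not taken from the project):

```python
import random

def flip_mutation(chromosome, rate=0.05):
    """Flip each bit independently with probability `rate`."""
    return [1 - gene if random.random() < rate else gene
            for gene in chromosome]
```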
**Working of Particle Swarm Optimization (PSO):**
PSO is initialized with a group of random particles (candidate solutions) and then searches for optima by updating generations. In every iteration, each particle is updated by tracking two "best" values. The first is the best solution (by fitness) the particle itself has achieved so far; this value is called pbest, and its fitness is also stored. The second is the best value obtained so far by any particle in the population; this global best is called gbest. When a particle takes only part of the population as its topological neighbors, the best value in that neighborhood is a local best, called lbest. After finding these best values, the particle updates its velocity and position in each iteration.
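The update rule described above can be sketched as a minimal gbest-variant PSO minimizing a 10-variable test function (an illustrative implementation, not the repository's code; the inertia weight and acceleration coefficients are assumed typical values):

```python
import random

def pso(objective, dim=10, n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5, bounds=(-10.0, 10.0)):
    """Minimize `objective` with gbest particle swarm optimization."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                     # each particle's best position
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]    # swarm-wide best

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # velocity pulled toward both pbest and gbest
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

def sphere(x):
    """Simple 10-variable test function with global minimum 0 at the origin."""
    return sum(v * v for v in x)
```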
Technologies Used
 Intel Python
 NumPy
 Matplotlib
 random (Python standard library)
Repository
https://github.com/guptasamarth61/OptimizationMetaheuristics