Geneticalgorithm

Genetic Algorithm Package for Python

Install / Use

/learn @rmsolgi/Geneticalgorithm

README

geneticalgorithm

geneticalgorithm is a Python library distributed on PyPI that implements standard and elitist genetic algorithms (GA). The package solves continuous, combinatorial, and mixed optimization problems with continuous, discrete, and mixed variables, and provides an easy GA implementation in Python.

Installation

Use the package manager pip to install geneticalgorithm in Python.

pip install geneticalgorithm

version 1.0.2 updates

@param convergence_curve <True/False> - Plot the convergence curve or not. Default is True.

@param progress_bar <True/False> - Show progress bar or not. Default is True.

A simple example

Assume we want to find a set of X=(x1,x2,x3) that minimizes function f(X)=x1+x2+x3 where X can be any real number in [0,10].
This is a trivial problem and we already know that the answer is X=(0,0,0) where f(X)=0.
We just use this simple example to see how to implement geneticalgorithm:

First we import geneticalgorithm and numpy. Next, we define the function f that we want to minimize and the boundaries of the decision variables. Then geneticalgorithm is called to solve the defined optimization problem:

import numpy as np
from geneticalgorithm import geneticalgorithm as ga

def f(X):
    return np.sum(X)

varbound=np.array([[0,10]]*3)

model=ga(function=f,dimension=3,variable_type='real',variable_boundaries=varbound)

model.run()

Notice that we define the function f so that its output is the objective function we want to minimize, where the input is the set of decision variables X. The boundaries for the variables must be defined as a numpy array, with a separate boundary for each variable. Here we have three variables, all with the same boundaries (for the case where the boundaries differ, see the example with mixed variables).

geneticalgorithm takes several arguments:
The first argument is the function f we already defined (for more details about the argument and output see Function).
Our problem has three variables, so we set dimension equal to three.
The variables are real (continuous), so we use the string 'real' to specify the variable type (geneticalgorithm also accepts Boolean, integer, and mixed types; see the other examples).
Finally, we pass varbound, which contains the boundaries of the variables. Note that the length of variable_boundaries must equal dimension.
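As a quick sanity check (a sketch in plain NumPy, independent of the library), the boundaries array can be verified to have one [lower, upper] row per decision variable:

```python
import numpy as np

dimension = 3
varbound = np.array([[0, 10]] * dimension)

# One [lower, upper] pair per decision variable: shape (dimension, 2).
assert varbound.shape == (dimension, 2)
print(varbound.shape)
```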

If you run the code, you should see a progress bar showing the progress of the genetic algorithm (GA), followed by the solution, the objective function value, and the convergence curve.

We can also access the best answer of the defined optimization problem found by geneticalgorithm as a dictionary, as well as a report of the progress of the genetic algorithm. To do so, we extend the code as follows:

convergence=model.report
solution=model.output_dict

output_dict is a dictionary containing the best set of variables found and the value of the given function at that point ({'variable': ..., 'function': ...}). report is a list recording the convergence of the algorithm over the iterations.
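To illustrate the shape of these two objects, here is a plain-Python sketch with hypothetical values (not produced by an actual run):

```python
import numpy as np

# Hypothetical contents mirroring what model.run() populates:
output_dict = {'variable': np.array([0.001, 0.0, 0.002]),
               'function': 0.003}
report = [5.2, 1.1, 0.4, 0.01, 0.003]  # best objective value per iteration

print(output_dict['variable'])   # best decision variables found
print(output_dict['function'])   # objective value at that point
print(report[-1])                # final (converged) objective value
```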

The simple example with integer variables

Consider the problem given in the simple example above, but now assume all variables are integers, so x1, x2, x3 can be any integer in [0,10]. In this case the code is as follows:

import numpy as np
from geneticalgorithm import geneticalgorithm as ga

def f(X):
    return np.sum(X)

varbound=np.array([[0,10]]*3)

model=ga(function=f,dimension=3,variable_type='int',variable_boundaries=varbound)

model.run()


As can be seen, the only difference is that we use the string 'int' for variable_type.

The simple example with Boolean variables

Consider the problem given in the simple example above, but now assume all variables are Boolean instead of real or integer, so each variable can be either zero or one. Also, instead of three, let's have 30 variables. In this case the code is as follows:

import numpy as np
from geneticalgorithm import geneticalgorithm as ga

def f(X):
    return np.sum(X)
    

model=ga(function=f,dimension=30,variable_type='bool')

model.run()


Note that we use the string 'bool' for variable_type when all variables are Boolean.
Note that when variable_type equals 'bool', there is no need to define variable_boundaries.

The simple example with mixed variables

Consider the problem given in the simple example above, where we want to minimize f(X)=x1+x2+x3. Now assume x1 is a real (continuous) variable in [0.5,1.5], x2 is an integer variable in [1,100], and x3 is a Boolean variable that can be either zero or one. We already know that the answer is X=(0.5,1,0) where f(X)=1.5. We implement geneticalgorithm as follows:

import numpy as np
from geneticalgorithm import geneticalgorithm as ga

def f(X):
    return np.sum(X)
    
varbound=np.array([[0.5,1.5],[1,100],[0,1]])
vartype=np.array([['real'],['int'],['int']])
model=ga(function=f,dimension=3,variable_type_mixed=vartype,variable_boundaries=varbound)

model.run()

Note that for mixed variables we must define boundaries, and we also need a numpy array of variable types as above (vartype). The order of variables in both arrays must match. Also notice that in this case, Boolean variables use the string 'int' with boundary [0,1].
Notice that we use the argument variable_type_mixed to pass the numpy array of variable types for functions with mixed variables.
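A small consistency check (plain NumPy, independent of the library) makes the row-by-row pairing between vartype and varbound explicit:

```python
import numpy as np

varbound = np.array([[0.5, 1.5], [1, 100], [0, 1]])
vartype = np.array([['real'], ['int'], ['int']])

# The i-th type entry describes the i-th boundary row, in order.
assert len(vartype) == len(varbound)
for t, (lo, hi) in zip(vartype.flatten(), varbound):
    print(t, lo, hi)
```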

Maximization problems

geneticalgorithm is designed to minimize the given function. A simple trick for solving maximization problems is to multiply the objective function by -1: the maximum of the original function is then the negative of the minimum found. Consider the simple example above, and let's find the maximum of f(X)=x1+x2+x3 where X is a set of real variables in [0,10]. We already know that the answer is X=(10,10,10) where f(X)=30.

import numpy as np
from geneticalgorithm import geneticalgorithm as ga

def f(X):
    return -np.sum(X)
    
varbound=np.array([[0,10]]*3)

model=ga(function=f,dimension=3,variable_type='real',variable_boundaries=varbound)

model.run()

As seen above, np.sum(X) is multiplied by a negative sign.
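The sign trick can be checked without running the GA at all: minimize the negated objective, then negate the reported minimum to recover the maximum (plain NumPy, using the known optimum):

```python
import numpy as np

def f(X):
    return -np.sum(X)  # negated objective: minimizing this maximizes sum(X)

best = np.array([10.0, 10.0, 10.0])  # known maximizer on [0, 10]^3
minimum_found = f(best)              # the GA would report this value: -30.0
print(-minimum_found)                # recover the true maximum: 30.0
```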

Optimization problems with constraints

In all the above examples, the optimization problem was unconstrained. Now consider minimizing f(X)=x1+x2+x3 where X is a set of real variables in [0,10], subject to the extra constraint that the sum of x1 and x2 is greater than or equal to 2. The minimum of f(X) is then 2. In such a case, a common trick is to define a penalty function. Hence we use the code below:

import numpy as np
from geneticalgorithm import geneticalgorithm as ga

def f(X):
    pen=0
    if X[0]+X[1]<2:
        pen=500+1000*(2-X[0]-X[1])
    return np.sum(X)+pen
    
varbound=np.array([[0,10]]*3)

model=ga(function=f,dimension=3,variable_type='real',variable_boundaries=varbound)

model.run()

As seen above we add a penalty to the objective function whenever the constraint is not met.

Some hints about how to define a penalty function:
1- You may use a constant greater than the maximum possible value of the objective function, if the maximum is known or can be estimated. Here the highest possible value of our function is 30 (i.e. if all variables were 10, f(X)=30), so a constant of 500 was chosen. This way, even if an infeasible trial solution has a small objective value, its penalized objective (fitness) is worse than that of any feasible solution.
2- Use a coefficient large enough, and multiply it by the amount of violation. This helps the algorithm learn how to approach the feasible domain.
3- How the penalty function is defined usually influences the convergence rate of an evolutionary algorithm. In my book on metaheuristics and evolutionary algorithms you can learn more about that.
4- Finally, after solving the problem, test the solution to see whether the constraints are met. If they are not, a bigger penalty is required. However, in problems where the optimum lies exactly on the boundary of the feasible region (or very close to the constraints), which is common in some kinds of problems, a very strict and large penalty may prevent the genetic algorithm from approaching the optimal region. In such cases, designing an appropriate penalty function is more challenging: the penalty should let the algorithm search the infeasible domain while still converging to a feasible solution, which may require a more sophisticated penalty function. In most cases, though, the formulation above works fairly well.
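To see how the penalty reshapes the fitness landscape, the penalized objective from the example above can be evaluated at a feasible and an infeasible point (plain NumPy, no GA run needed):

```python
import numpy as np

def penalized_f(X):
    pen = 0
    if X[0] + X[1] < 2:
        # constant offset plus a term proportional to the violation
        pen = 500 + 1000 * (2 - X[0] - X[1])
    return np.sum(X) + pen

feasible = np.array([1.0, 1.0, 0.0])    # x1 + x2 = 2: constraint met
infeasible = np.array([0.0, 0.0, 0.0])  # x1 + x2 = 0: violated by 2

print(penalized_f(feasible))    # 2.0 (no penalty)
print(penalized_f(infeasible))  # 2500.0 (0 + 500 + 1000*2)
```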

Genetic algorithm's parameters

Every evolutionary algorithm (metaheuristic) has some parameters to be adjusted, and the genetic algorithm is no exception. The parameters of geneticalgorithm are defined as a dictionary:


algorithm_param = {'max_num_iteration': None,\
                   'population_size':100,\
                   'mutation_probability':0.1,\
                   'elit_ratio': 0.01,\
                   'crossover_probability': 0.5,\
                   'parents_portion': 0.3,\
                   'crossover_type':'uniform',\
                   'max_iteration_without_improv':None}
