

Blog: Genetic Algorithms

A genetic algorithm is a type of evolutionary algorithm. So, in order to understand genetic algorithms, we need to discuss evolutionary algorithms.


An evolutionary algorithm is a meta-heuristic optimization algorithm that applies the principles of evolution to solve problems, mirroring the process we find in nature. A generic evolutionary algorithm works directly with the problem's functions and variables to arrive at the solution. But in a genetic algorithm, any given problem is encoded in bit patterns that are manipulated by the algorithm.

The underlying idea in all evolutionary algorithms is that we take a population of individuals and apply a natural selection process. We start with a set of randomly selected individuals and then identify the strongest among them. The strength of each individual is determined using a predefined fitness function; in effect, we use a survival-of-the-fittest approach.

We then take these selected individuals and create the next generation of individuals by recombination and mutation. We will discuss the concepts of recombination and mutation in the next section. For now, let’s think of these techniques as mechanisms to create the next generation by treating the selected individuals as parents.

Once we execute recombination and mutation, we create a new set of individuals who will compete with the old ones for a place in the next generation. By discarding the weakest individuals and replacing them with offspring, we are increasing the overall fitness level of the population. We continue to iterate until the desired overall fitness is achieved.
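The loop described above can be sketched in plain Python. Everything here (names, population size, mutation rate, the toy fitness function) is illustrative rather than taken from any library:

```python
import random

# A toy evolutionary loop: maximize the number of ones in a bit string.
POP_SIZE, GENOME_LEN, GENERATIONS = 20, 10, 50

def fitness(individual):
    return sum(individual)

def mutate(individual, rate=0.1):
    # Flip each bit independently with a small probability
    return [bit ^ 1 if random.random() < rate else bit for bit in individual]

random.seed(0)
population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]

for _ in range(GENERATIONS):
    # Survival of the fittest: keep the top half as parents
    population.sort(key=fitness, reverse=True)
    parents = population[:POP_SIZE // 2]
    # Offspring are mutated copies of the parents
    offspring = [mutate(p) for p in parents]
    population = parents + offspring

best = max(population, key=fitness)
```

Because the parents survive each round, the best fitness never decreases, and the population drifts toward all-ones strings.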

A genetic algorithm is an evolutionary algorithm where we use a heuristic to find a string of bits that solves a problem. We continuously iterate on a population to arrive at a solution. We do this by generating new populations containing stronger individuals. We apply probabilistic operators such as selection, crossover, and mutation in order to generate the next generation of individuals. The individuals are basically strings, where every string is the encoded version of a potential solution.

A fitness function evaluates the fitness measure of each string, telling us how well suited it is to solving the problem. This fitness function is also referred to as an evaluation function. Genetic algorithms apply operators that are inspired by nature, which is why the nomenclature is closely related to terms found in biology.

Fundamental concepts in genetic algorithms

In order to build a genetic algorithm, we need to understand several key concepts and terminology. These concepts are used extensively throughout the field of genetic algorithms to build solutions to various problems. One of the most important aspects of genetic algorithms is randomness: to iterate, they rely on random sampling of individuals. This means that the process is non-deterministic, so if you run the same algorithm multiple times, you might end up with different solutions.

Let’s talk about population. A population is a set of individuals that are possible candidate solutions. A genetic algorithm does not maintain a single best solution at any given stage; it maintains a set of potential solutions, one of which is the best. But the other solutions play an important role during the search. Since we have a population of solutions, it is less likely that we will get stuck in a local optimum. Getting stuck in a local optimum is a classic problem faced by other optimization techniques.

Now that we know about population and the stochastic nature of genetic algorithms, let’s talk about the operators. In order to create the next generation of individuals, we need to make sure that they come from the strongest individuals in the current generation. Mutation is one way to do this: the genetic algorithm makes random changes to one or more individuals of the current generation to yield new candidate solutions. This change is called mutation, and it might make an individual better or worse than the existing individuals.

The next concept here is recombination, which is also called crossover. This is directly related to the role of reproduction in the evolution process. A genetic algorithm tries to combine individuals from the current generation to create a new solution. It combines some of the features of each parent individual to create this offspring. This process is called crossover. The goal is to replace the weaker individuals in the current generation with offspring generated from stronger individuals in the population.

In order to apply crossover and mutation, we need to have selection criteria. The concept of selection is inspired by the theory of natural selection. During each iteration, the genetic algorithm performs a selection process. The strongest individuals are chosen using this selection process and the weaker individuals are terminated. This is where the survival of the fittest concept comes into play. The selection process is carried out using a fitness function that computes the strength of each individual.
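The three operators can be sketched as toy Python functions on bit strings. These are simplified illustrations of the same ideas, not the DEAP implementations:

```python
import random

def tournament_select(population, fitness, k=3):
    """Pick the fittest of k randomly chosen individuals."""
    contenders = random.sample(population, k)
    return max(contenders, key=fitness)

def one_point_crossover(parent1, parent2):
    """Swap the tails of the two parents at a random cut point."""
    point = random.randrange(1, len(parent1))
    return parent1[:point] + parent2[point:], parent2[:point] + parent1[point:]

def flip_mutation(individual, indpb=0.05):
    """Flip each bit independently with probability indpb."""
    return [bit ^ 1 if random.random() < indpb else bit for bit in individual]

random.seed(1)
pop = [[random.randint(0, 1) for _ in range(8)] for _ in range(10)]
parent_a = tournament_select(pop, fitness=sum)
parent_b = tournament_select(pop, fitness=sum)
child1, child2 = one_point_crossover(parent_a, parent_b)
child1 = flip_mutation(child1)
```

Selection biases reproduction toward strong individuals, crossover recombines their features, and mutation injects fresh variation.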

Generating a bit pattern with predefined parameters

Now that we know how a genetic algorithm works, let’s see how to use it to solve some problems. We will be using a Python package called DEAP. You can find all the details about it at http://bit.ly/2PTutiH. Let’s go ahead and install it by running the following command on your Terminal:

$ pip3 install deap

Now that the package is installed, let’s quickly test it. Go into the Python shell by typing the following on your Terminal:

$ python3

Once you are inside, type the following:

>>> import deap

If you do not see an error message, we are good to go.

In this section, we will solve a variant of the One Max problem. The One Max problem is about generating a bit string that contains the maximum number of ones. It is a simple problem, but it’s very helpful in getting familiar with the library as well as understanding how to implement solutions using genetic algorithms. In our case, we will try to generate a bit string that contains a predefined number of ones. You will see that the underlying structure and part of the code is similar to the example used in the DEAP library.

Create a new Python file and import the following:

import random 

from deap import base, creator, tools

Let’s say we want to generate a bit pattern of length 75, and we want it to contain 45 ones. We need to define an evaluation function that can be used to target this objective:

# Evaluation function 
def eval_func(individual):
    target_sum = 45
    return len(individual) - abs(sum(individual) - target_sum),

If you look at the formula used in the preceding function, you can see that it reaches its maximum value when the number of ones is equal to 45. The length of each individual is 75. When the number of ones is equal to 45, the return value would be 75.
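We can sanity-check this formula in plain Python, restating the evaluation function from above outside DEAP:

```python
# Quick check of the fitness formula, outside DEAP
def eval_func(individual):
    target_sum = 45
    return len(individual) - abs(sum(individual) - target_sum),

perfect = [1] * 45 + [0] * 30    # exactly 45 ones
too_many = [1] * 60 + [0] * 15   # 60 ones, 15 too many

print(eval_func(perfect))    # (75,)
print(eval_func(too_many))   # (60,)
```

Any deviation from 45 ones is subtracted from the maximum score of 75.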

We now need to define a function to create the toolbox. Let’s define a creator object for the fitness function and to keep track of the individuals. The Fitness class used here is an abstract class, and it needs the weights attribute to be defined. We are building a maximizing fitness using positive weights:

# Create the toolbox with the right parameters 
def create_toolbox(num_bits):
    creator.create("FitnessMax", base.Fitness, weights=(1.0,))
    creator.create("Individual", list, fitness=creator.FitnessMax)

The first line creates a single-objective maximizing fitness named FitnessMax. The second line deals with producing the individual; here, each individual is represented as a list. In order to produce this individual, we must create an Individual class using the creator. The fitness attribute will use the FitnessMax defined earlier.

A toolbox is an object that is commonly used in DEAP. It is used to store various functions along with their arguments. Let’s create this object:

    # Initialize the toolbox 
    toolbox = base.Toolbox()

We will now start registering various functions to this toolbox. Let’s start with the random number generator that generates a random integer between 0 and 1. This is basically to generate the bit strings:

    # Generate attributes 
    toolbox.register("attr_bool", random.randint, 0, 1)

Let’s register the individual function. The method initRepeat takes three arguments – a container class for the individual, a function used to fill the container, and the number of times we want the function to repeat itself:

    # Initialize structures 
    toolbox.register("individual", tools.initRepeat, creator.Individual,
                     toolbox.attr_bool, num_bits)

We need to register the population function. We want the population to be a list of individuals:

    # Define the population to be a list of individuals 
    toolbox.register("population", tools.initRepeat, list, toolbox.individual)

We now need to register the genetic operators. Register the evaluation function that we defined earlier, which will act as our fitness function. We want the individual, which is a bit pattern, to have 45 ones:

    # Register the evaluation operator 
    toolbox.register("evaluate", eval_func)

Register the crossover operator named mate using the cxTwoPoint method:

    # Register the crossover operator 
    toolbox.register("mate", tools.cxTwoPoint)

Register the mutation operator named mutate using mutFlipBit. We need to specify the probability of each attribute to be mutated using indpb:

    # Register a mutation operator 
    toolbox.register("mutate", tools.mutFlipBit, indpb=0.05)

Register the selection operator using selTournament. It specifies which individuals will be selected for breeding:

    # Operator for selecting individuals for breeding 
    toolbox.register("select", tools.selTournament, tournsize=3)

    return toolbox

This is basically the implementation of all the concepts we discussed in the preceding section. A toolbox generator function is very common in DEAP and we will use it throughout this article. So it’s important to spend some time to understand how we generated this toolbox.
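To demystify the toolbox, here is a simplified model of what register and initRepeat do, using functools.partial (DEAP's Toolbox is built on the same idea). The init_repeat helper here is a stand-in for tools.initRepeat, not the DEAP function itself:

```python
import random
from functools import partial

# A simplified model of toolbox.register: bind arguments with
# functools.partial and store the result under a name.
def init_repeat(container, func, n):
    """Fill a container by calling func n times (like tools.initRepeat)."""
    return container(func() for _ in range(n))

random.seed(0)
attr_bool = partial(random.randint, 0, 1)                 # "attr_bool"
individual = partial(init_repeat, list, attr_bool, 75)    # "individual"
population = partial(init_repeat, list, individual, 500)  # "population"

pop = population()
```

Calling population() yields 500 lists of 75 random bits, which is exactly what toolbox.population(n=500) produces in the real script.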

Define the main function by starting with the length of the bit pattern:

if __name__ == "__main__": 
    # Define the number of bits
    num_bits = 75

Create a toolbox using the function we defined earlier:

    # Create a toolbox using the above parameter 
    toolbox = create_toolbox(num_bits)

We need to seed the random number generator so that we get repeatable results:

    # Seed the random number generator 
    random.seed(7)
Create an initial population of, say, 500 individuals using the method available in the toolbox object. Feel free to change this number and experiment with it:

    # Create an initial population of 500 individuals 
    population = toolbox.population(n=500)

Define the probabilities of crossing and mutating. Again, these are parameters that are defined by the user. So you can change these parameters and see how they affect the result:

    # Define probabilities of crossing and mutating 
    probab_crossing, probab_mutating = 0.5, 0.2

Define the number of generations that we need to iterate until the process is terminated. If you increase the number of generations, you are giving it more freedom to improve the strength of the population:

    # Define the number of generations 
    num_generations = 60

Evaluate all the individuals in the population using the fitness function:

    print('\nStarting the evolution process') 

    # Evaluate the entire population
    fitnesses = list(map(toolbox.evaluate, population))
    for ind, fit in zip(population, fitnesses):
        ind.fitness.values = fit

Start iterating through the generations:

    print('\nEvaluated', len(population), 'individuals') 

    # Iterate through generations
    for g in range(num_generations):
        print("\n===== Generation", g)

In each generation, select the next generation individuals using the selection operator that we registered to the toolbox earlier:

        # Select the next generation individuals 
        offspring = toolbox.select(population, len(population))

Clone the selected individuals:

        # Clone the selected individuals 
        offspring = list(map(toolbox.clone, offspring))

Apply crossover and mutation on the next generation individuals using the probability values defined earlier. Once it’s done, we need to reset the fitness values:

        # Apply crossover and mutation on the offspring 
        for child1, child2 in zip(offspring[::2], offspring[1::2]):
            # Cross two individuals
            if random.random() < probab_crossing:
                toolbox.mate(child1, child2)

                # "Forget" the fitness values of the children
                del child1.fitness.values
                del child2.fitness.values

Apply mutation to the next generation individuals using the corresponding probability value that we defined earlier. Once it’s done, reset the fitness value:

        # Apply mutation 
        for mutant in offspring:
            # Mutate an individual
            if random.random() < probab_mutating:
                toolbox.mutate(mutant)
                del mutant.fitness.values

Evaluate the individuals with invalid fitness values:

        # Evaluate the individuals with an invalid fitness 
        invalid_ind = [ind for ind in offspring if not ind.fitness.valid]
        fitnesses = map(toolbox.evaluate, invalid_ind)
        for ind, fit in zip(invalid_ind, fitnesses):
            ind.fitness.values = fit

        print('Evaluated', len(invalid_ind), 'individuals')

Replace the population with the next generation individuals:

        # The population is entirely replaced by the offspring 
        population[:] = offspring

Print the stats for the current generation to see how it’s progressing:

        # Gather all the fitnesses in one list and print the stats 
        fits = [ind.fitness.values[0] for ind in population]

        length = len(population)
        mean = sum(fits) / length
        sum2 = sum(x*x for x in fits)
        std = abs(sum2 / length - mean**2)**0.5

        print('Min =', min(fits), ', Max =', max(fits))
        print('Average =', round(mean, 2), ', Standard deviation =',
              round(std, 2))
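The standard deviation computed here is the population formula sqrt(E[x^2] - mean^2). We can verify against the standard library, using a made-up list of fitness values for illustration:

```python
import statistics

# The script's formula is the population standard deviation,
# sqrt(E[x^2] - mean^2); statistics.pstdev computes the same quantity.
fits = [70.0, 72.0, 75.0, 75.0, 68.0]   # illustrative fitness values

length = len(fits)
mean = sum(fits) / length
sum2 = sum(x * x for x in fits)
std = abs(sum2 / length - mean**2) ** 0.5

print(round(std, 2))   # 2.76
```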

    print("\n==== End of evolution")

Print the final output:

    best_ind = tools.selBest(population, 1)[0] 
    print('\nBest individual:\n', best_ind)
    print('\nNumber of ones:', sum(best_ind))

The full code is given in the file bit_counter.py. If you run the code, you will see the statistics for each generation printed to your Terminal, followed by a message indicating the end of the evolution.

The evolution process ends after 60 generations. Once it's done, the best individual is picked and printed. It contains 45 ones, which confirms the result, because the target sum in our evaluation function is 45.

Visualizing the evolution

Let’s see how we can visualize the evolution process. DEAP uses a method called Covariance Matrix Adaptation Evolution Strategy (CMA-ES) in its visualization example. CMA-ES is an evolutionary algorithm that's used to solve non-linear problems in the continuous domain. The technique is robust, well studied, and considered state of the art among evolutionary algorithms. Let’s see how it works by delving into the code. The following code is a slight variation of the example shown in the DEAP library.

Create a new Python file and import the following:

import numpy as np 
import matplotlib.pyplot as plt
from deap import algorithms, base, benchmarks, \
cma, creator, tools

Define a function to create the toolbox. We will define a FitnessMin function using negative weights:

# Function to create a toolbox 
def create_toolbox(strategy):
    creator.create("FitnessMin", base.Fitness, weights=(-1.0,))
    creator.create("Individual", list, fitness=creator.FitnessMin)

Create the toolbox and register the evaluation function, as follows:

    toolbox = base.Toolbox() 
    toolbox.register("evaluate", benchmarks.rastrigin)

    # Seed the random number generator
    np.random.seed(7)
Register the generate and update methods. This is related to the generate-update paradigm where we generate a population from a strategy and this strategy is updated based on the population:
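The generate-update paradigm itself can be illustrated without CMA-ES. The following toy (1, lambda) evolution strategy minimizes f(x) = x^2 in one dimension; the centroid and sigma update rules here are deliberately simplistic stand-ins, whereas CMA-ES adapts a full covariance matrix:

```python
import random

# Toy generate-update loop: a simple evolution strategy
# minimizing f(x) = x^2, starting far from the optimum.
def f(x):
    return x * x

random.seed(0)
centroid, sigma, lam = 5.0, 1.0, 10

for _ in range(100):
    # generate: sample a population from the current strategy
    population = [random.gauss(centroid, sigma) for _ in range(lam)]
    # update: move the strategy toward the best samples
    population.sort(key=f)
    centroid = sum(population[:3]) / 3
    sigma *= 0.95    # shrink the search radius over time
```

The strategy object in DEAP plays the role of (centroid, sigma) here: generate samples from it, and update refits it to the evaluated population.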

    toolbox.register("generate", strategy.generate, creator.Individual) 
    toolbox.register("update", strategy.update)

    return toolbox

Define the main function. Start by defining the number of individuals and the number of generations:

if __name__ == "__main__": 
    # Problem size
    num_individuals = 10
    num_generations = 125

We need to define a strategy before we start the process:

    # Create a strategy using CMA-ES algorithm 
    strategy = cma.Strategy(centroid=[5.0]*num_individuals, sigma=5.0,
                            lambda_=20*num_individuals)

Create the toolbox based on the strategy:

    # Create toolbox based on the above strategy 
    toolbox = create_toolbox(strategy)

Create a HallOfFame object. The HallOfFame object contains the best individual that ever existed in the population. This object is kept in a sorted format at all times. This way, the first element in this object is the individual that has the best fitness value ever seen during the evolution process:

    # Create hall of fame object 
    hall_of_fame = tools.HallOfFame(1)

Register the stats using the Statistics method:

    # Register the relevant stats 
    stats = tools.Statistics(lambda x: x.fitness.values)
    stats.register("avg", np.mean)
    stats.register("std", np.std)
    stats.register("min", np.min)
    stats.register("max", np.max)

Define the logbook to keep track of the evolution records. It is basically a chronological list of dictionaries:

    logbook = tools.Logbook() 
    logbook.header = "gen", "evals", "std", "min", "avg", "max"

Define objects to compile all the data:

    # Objects that will compile the data 
    sigma = np.ndarray((num_generations, 1))
    axis_ratio = np.ndarray((num_generations, 1))
    diagD = np.ndarray((num_generations, num_individuals))
    fbest = np.ndarray((num_generations, 1))
    best = np.ndarray((num_generations, num_individuals))
    std = np.ndarray((num_generations, num_individuals))

Iterate through the generations:

    for gen in range(num_generations): 
        # Generate a new population
        population = toolbox.generate()

Evaluate individuals using the fitness function:

        # Evaluate the individuals 
        fitnesses = toolbox.map(toolbox.evaluate, population)
        for ind, fit in zip(population, fitnesses):
            ind.fitness.values = fit

Update the strategy based on the population:

        # Update the strategy with the evaluated individuals 
        toolbox.update(population)
Update the hall of fame and statistics with the current generation of individuals:

        # Update the hall of fame and the statistics with the 
        # currently evaluated population
        hall_of_fame.update(population)
        record = stats.compile(population)
        logbook.record(evals=len(population), gen=gen, **record)

        print(logbook.stream)
Save the data for plotting:

        # Save more data along the evolution for plotting 
        sigma[gen] = strategy.sigma
        axis_ratio[gen] = max(strategy.diagD)**2/min(strategy.diagD)**2
        diagD[gen, :num_individuals] = strategy.diagD**2
        fbest[gen] = hall_of_fame[0].fitness.values
        best[gen, :num_individuals] = hall_of_fame[0]
        std[gen, :num_individuals] = np.std(population, axis=0)

Define the x axis and plot the stats:

    # The x-axis will be the number of evaluations 
    x = list(range(0, strategy.lambda_ * num_generations, strategy.lambda_))
    avg, max_, min_ = logbook.select("avg", "max", "min")
    plt.semilogy(x, avg, "--b")
    plt.semilogy(x, max_, "--b")
    plt.semilogy(x, min_, "-b")
    plt.semilogy(x, fbest, "-c")
    plt.semilogy(x, sigma, "-g")
    plt.semilogy(x, axis_ratio, "-r")
    plt.title("blue: f-values, green: sigma, red: axis ratio")

Plot the progress:

    plt.figure()
    plt.plot(x, best)
    plt.title("Object Variables")

    plt.figure()
    plt.semilogy(x, diagD)
    plt.title("Scaling (All Main Axes)")

    plt.figure()
    plt.semilogy(x, std)
    plt.title("Standard Deviations in All Coordinates")

    plt.show()

The full code is given in the file visualization.py. If you run the code, you will see four plots: the first shows various parameters of the evolution, the second shows the object variables, the third shows the scaling, and the fourth shows the standard deviations. You will also see the progress printed on the Terminal. The fitness values keep decreasing as we progress, which indicates that the process is converging.

Solving the symbolic regression problem

Let’s see how to use genetic programming to solve the symbolic regression problem. It is important to understand that genetic programming is not the same as genetic algorithms. Genetic programming is a type of evolutionary algorithm in which the solutions occur in the form of computer programs. The individuals in each generation are computer programs, and their fitness level corresponds to their ability to solve the problem. These programs are modified at each iteration using a genetic algorithm. In essence, genetic programming is an application of genetic algorithms.

In the symbolic regression problem, we have a polynomial expression that needs to be approximated. It's a classic regression problem, where we try to estimate the underlying function. In this example, we will use the expression: f(x) = 2x³ - 3x² + 4x - 1
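Before involving genetic programming, it helps to look at the target function itself and the sample grid the fitness function will use (the same grid is registered later for the evaluate operator):

```python
# The target function the genetic program must rediscover,
# evaluated over a grid of sample points in [-1.0, 1.0).
def f(x):
    return 2 * x**3 - 3 * x**2 + 4 * x - 1

points = [x / 10.0 for x in range(-10, 10)]
samples = [(x, f(x)) for x in points]

print(samples[0])   # (-1.0, -10.0)
```

The genetic program never sees the formula, only these (x, f(x)) pairs, and its fitness is the mean squared error of its predictions over the grid.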

The code discussed here is a variant of the symbolic regression problem given in the DEAP library. Create a new Python file and import the following:

import operator 
import math
import random

import numpy as np
from deap import algorithms, base, creator, tools, gp

Create a division operator that can handle divide-by-zero error gracefully:

# Define new functions 
def division_operator(numerator, denominator):
    if denominator == 0:
        return 1

    return numerator / denominator

Define the evaluation function that will be used for fitness calculation. We need to define a callable function to run computations on the input individual:

# Define the evaluation function 
def eval_func(individual, points):
    # Transform the tree expression into a callable function
    func = toolbox.compile(expr=individual)

Compute the mean squared error (MSE) between the function defined earlier and the original expression:

    # Evaluate the mean squared error 
    mse = ((func(x) - (2 * x**3 - 3 * x**2 + 4 * x - 1))**2 for x in points)

    return math.fsum(mse) / len(points),

Define a function to create the toolbox. In order to create the toolbox here, we need to create a set of primitives. These primitives are basically operators that will be used during the evolution. They serve as building blocks for the individuals. We are going to use basic arithmetic functions as our primitives here:

# Function to create the toolbox 
def create_toolbox():
    pset = gp.PrimitiveSet("MAIN", 1)
    pset.addPrimitive(operator.add, 2)
    pset.addPrimitive(operator.sub, 2)
    pset.addPrimitive(operator.mul, 2)
    pset.addPrimitive(division_operator, 2)
    pset.addPrimitive(operator.neg, 1)
    pset.addPrimitive(math.cos, 1)
    pset.addPrimitive(math.sin, 1)

We now need to declare an ephemeral constant. It is a special terminal type that does not have a fixed value. When a given program appends such an ephemeral constant to the tree, the function gets executed. The result is then inserted into the tree as a constant terminal. These constant terminals can take the values -1, 0 or 1:

    pset.addEphemeralConstant("rand101", lambda: random.randint(-1, 1))

The default name for the arguments is ARGx. Let’s rename it x. It's not exactly necessary, but it’s a useful feature that comes in handy:

    pset.renameArguments(ARG0='x')

We need to define two object types — fitness and an individual. Let’s do it using the creator:

    creator.create("FitnessMin", base.Fitness, weights=(-1.0,)) 
    creator.create("Individual", gp.PrimitiveTree, fitness=creator.FitnessMin)

Create the toolbox and register the functions. The registration process is similar to previous sections:

    toolbox = base.Toolbox() 

    toolbox.register("expr", gp.genHalfAndHalf, pset=pset, min_=1, max_=2)
    toolbox.register("individual", tools.initIterate, creator.Individual, toolbox.expr)
    toolbox.register("population", tools.initRepeat, list, toolbox.individual)
    toolbox.register("compile", gp.compile, pset=pset)
    toolbox.register("evaluate", eval_func, points=[x/10. for x in range(-10,10)])
    toolbox.register("select", tools.selTournament, tournsize=3)
    toolbox.register("mate", gp.cxOnePoint)
    toolbox.register("expr_mut", gp.genFull, min_=0, max_=2)
    toolbox.register("mutate", gp.mutUniform, expr=toolbox.expr_mut, pset=pset)

    toolbox.decorate("mate", gp.staticLimit(key=operator.attrgetter("height"), max_value=17))
    toolbox.decorate("mutate", gp.staticLimit(key=operator.attrgetter("height"), max_value=17))

    return toolbox

Define the main function and start by seeding the random number generator:

if __name__ == "__main__": 
    random.seed(7)
Create the toolbox object:

    toolbox = create_toolbox()

Define the initial population using the method available in the toolbox object. We will use 450 individuals. The user defines this number, so you should feel free to experiment with it. Also define the hall_of_fame object:

    population = toolbox.population(n=450) 
    hall_of_fame = tools.HallOfFame(1)

Statistics are useful when we build genetic algorithms. Define the stats objects:

    stats_fit = tools.Statistics(lambda x: x.fitness.values) 
    stats_size = tools.Statistics(len)

Register the stats using the objects defined previously:

    mstats = tools.MultiStatistics(fitness=stats_fit, size=stats_size) 
    mstats.register("avg", np.mean)
    mstats.register("std", np.std)
    mstats.register("min", np.min)
    mstats.register("max", np.max)

Define the crossover probability, mutation probability, and the number of generations:

    probab_crossover = 0.4 
    probab_mutate = 0.2
    num_generations = 60

Run the evolutionary algorithm using the above parameters:

    population, log = algorithms.eaSimple(population, toolbox,
                                          probab_crossover, probab_mutate,
                                          num_generations, stats=mstats,
                                          halloffame=hall_of_fame, verbose=True)

The full code is given in the file symbol_regression.py. If you run the code, you will see the fitness and size statistics for each generation printed on your Terminal as the evolution progresses.

Building an intelligent robot controller

Let’s see how to build a robot controller using a genetic algorithm. We are given a map with 124 targets sprinkled all over it. The goal of the robot controller is to automatically traverse the map and consume all of those targets. This program is a variant of the artificial ant program given in the DEAP library.

Create a new Python file and import the following:

import copy 
import random
from functools import partial

import numpy as np
from deap import algorithms, base, creator, tools, gp

Create the class to control the robot:

class RobotController(object): 
    def __init__(self, max_moves):
        self.max_moves = max_moves
        self.moves = 0
        self.consumed = 0
        self.routine = None

Define the directions and movements:

        self.direction = ["north", "east", "south", "west"] 
        self.direction_row = [1, 0, -1, 0]
        self.direction_col = [0, 1, 0, -1]

Define the reset functionality:

    def _reset(self): 
        self.row = self.row_start
        self.col = self.col_start
        self.direction = 1
        self.moves = 0
        self.consumed = 0
        self.matrix_exc = copy.deepcopy(self.matrix)

Define the conditional operator:

    def _conditional(self, condition, out1, out2): 
        out1() if condition() else out2()

Define the left turning operator:

    def turn_left(self): 
        if self.moves < self.max_moves:
            self.moves += 1
            self.direction = (self.direction - 1) % 4

Define the right turning operator:

    def turn_right(self): 
        if self.moves < self.max_moves:
            self.moves += 1
            self.direction = (self.direction + 1) % 4

Define the method to control how the robot moves forward:

    def move_forward(self): 
        if self.moves < self.max_moves:
            self.moves += 1
            self.row = (self.row + self.direction_row[self.direction]) % self.matrix_row
            self.col = (self.col + self.direction_col[self.direction]) % self.matrix_col

            if self.matrix_exc[self.row][self.col] == "target":
                self.consumed += 1

            self.matrix_exc[self.row][self.col] = "passed"

Define a method to sense the target. If you see the target ahead, then update the matrix accordingly:

    def sense_target(self): 
        ahead_row = (self.row + self.direction_row[self.direction]) % self.matrix_row
        ahead_col = (self.col + self.direction_col[self.direction]) % self.matrix_col
        return self.matrix_exc[ahead_row][ahead_col] == "target"

If you see the target ahead, then create the relevant function and return it:

    def if_target_ahead(self, out1, out2): 
        return partial(self._conditional, self.sense_target, out1, out2)
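This lazy-branching pattern is worth pausing on: partial packages a condition and two actions into a single zero-argument callable, and nothing runs until that callable is invoked. Here is a standalone sketch of the same idea, with illustrative names:

```python
from functools import partial

# partial wraps a condition and two actions into a zero-argument
# callable; the branch is only evaluated when the callable is invoked.
log = []

def conditional(condition, out1, out2):
    out1() if condition() else out2()

branch = partial(conditional,
                 lambda: len(log) == 0,        # condition
                 lambda: log.append('then'),   # runs if condition is true
                 lambda: log.append('else'))   # runs otherwise

branch()   # log is empty, so this appends 'then'
branch()   # log is no longer empty, so this appends 'else'
```

This deferral is what lets the genetic program treat if_target_ahead as a node in a tree: the tree is built first, and the branches execute only when the compiled routine runs.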

Define the method to run it:

    def run(self, routine): 
        self._reset()
        while self.moves < self.max_moves:
            routine()

Define a function to traverse the input map. The symbol # indicates all the targets on the map and the symbol S indicates the starting point. The symbol . denotes empty cells:

    def traverse_map(self, matrix): 
        self.matrix = list()
        for i, line in enumerate(matrix):
            self.matrix.append(list())

            for j, col in enumerate(line):
                if col == "#":
                    self.matrix[-1].append("target")

                elif col == ".":
                    self.matrix[-1].append("empty")

                elif col == "S":
                    self.matrix[-1].append("empty")
                    self.row_start = self.row = i
                    self.col_start = self.col = j
                    self.direction = 1

        self.matrix_row = len(self.matrix)
        self.matrix_col = len(self.matrix[0])
        self.matrix_exc = copy.deepcopy(self.matrix)

Define a class to generate functions depending on the number of input arguments:

class Prog(object): 
    def _progn(self, *args):
        for arg in args:
            arg()

    def prog2(self, out1, out2):
        return partial(self._progn, out1, out2)

    def prog3(self, out1, out2, out3):
        return partial(self._progn, out1, out2, out3)
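The same deferred-execution idea drives Prog: _progn simply calls its arguments in order, and partial freezes a fixed set of actions into one callable node. A standalone sketch with illustrative names:

```python
from functools import partial

# progn sequences callables; partial freezes them into a single
# zero-argument "program" node that runs them in order when called.
calls = []

def progn(*args):
    for arg in args:
        arg()

step = partial(progn,
               lambda: calls.append(1),
               lambda: calls.append(2),
               lambda: calls.append(3))

step()   # executes the three actions in order
```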

Define an evaluation function for each individual:

def eval_func(individual): 
    global robot, pset

    # Transform the tree expression to functional Python code
    routine = gp.compile(individual, pset)

Run the current program:

    # Run the generated routine 
    robot.run(routine)

    return robot.consumed,

Define a function to create the toolbox, add the primitives, and register the robot's actions as terminals:

def create_toolbox(): 
    global robot, pset

    pset = gp.PrimitiveSet("MAIN", 0)
    pset.addPrimitive(robot.if_target_ahead, 2)
    pset.addPrimitive(Prog().prog2, 2)
    pset.addPrimitive(Prog().prog3, 3)
    pset.addTerminal(robot.turn_left)
    pset.addTerminal(robot.turn_right)
    pset.addTerminal(robot.move_forward)

Create the object types using the fitness function:

    creator.create("FitnessMax", base.Fitness, weights=(1.0,)) 
    creator.create("Individual", gp.PrimitiveTree, fitness=creator.FitnessMax)

Create the toolbox and register all the operators:

    toolbox = base.Toolbox() 

    # Attribute generator
    toolbox.register("expr_init", gp.genFull, pset=pset, min_=1, max_=2)

    # Structure initializers
    toolbox.register("individual", tools.initIterate, creator.Individual, toolbox.expr_init)
    toolbox.register("population", tools.initRepeat, list, toolbox.individual)

    toolbox.register("evaluate", eval_func)
    toolbox.register("select", tools.selTournament, tournsize=7)
    toolbox.register("mate", gp.cxOnePoint)
    toolbox.register("expr_mut", gp.genFull, min_=0, max_=2)
    toolbox.register("mutate", gp.mutUniform, expr=toolbox.expr_mut, pset=pset)

    return toolbox

Define the main function and start by seeding the random number generator:

if __name__ == "__main__": 
    global robot

    # Seed the random number generator
    random.seed(7)

Create the robot controller object using the initialization parameter:

    # Define the maximum number of moves 
    max_moves = 750

    # Create the robot object
    robot = RobotController(max_moves)

Create the toolbox using the function we defined earlier:

    # Create the toolbox 
    toolbox = create_toolbox()

Read the map data from the input file:

    # Read the map data 
    with open('target_map.txt', 'r') as f:
        robot.traverse_map(f)

Define the population with 400 individuals and define the hall_of_fame object:

    # Define population and hall of fame variables 
    population = toolbox.population(n=400)
    hall_of_fame = tools.HallOfFame(1)

Register the stats:

    # Register the stats 
    stats = tools.Statistics(lambda x: x.fitness.values)
    stats.register("avg", np.mean)
    stats.register("std", np.std)
    stats.register("min", np.min)
    stats.register("max", np.max)

Define the crossover probability, mutation probability, and the number of generations:

    # Define parameters 
    probab_crossover = 0.4
    probab_mutate = 0.3
    num_generations = 50

Run the evolutionary algorithm using the parameters defined earlier:

    # Run the algorithm to solve the problem 
    algorithms.eaSimple(population, toolbox, probab_crossover,
                        probab_mutate, num_generations, stats,
                        halloffame=hall_of_fame)
The full code is given in the file robot.py. If you run the code, you will see the statistics for each generation printed on your Terminal as the evolution progresses.

Source: Artificial Intelligence on Medium
