This project implements a hybrid optimization approach that combines an Artificial Neural Network (ANN) for fitness function approximation with a Genetic Algorithm (GA) for searching out optima. The method leverages the prediction capabilities of the trained network as a cheap surrogate fitness measure while using evolutionary search to explore the solution space effectively.

- Three-layer Multilayer Perceptron (MLP) for function approximation
- Genetic Algorithm optimization with customizable parameters
- Feature importance analysis
- Visualization of search progress and results
The implementation uses a three-layer MLP with the following architecture:
```python
import torch.nn as nn

class Net(nn.Module):
    def __init__(self, input_size, hidden_size1, hidden_size2, output_size):
        super(Net, self).__init__()
        # Two hidden layers with ReLU activations feeding a linear output layer
        self.fc1 = nn.Linear(input_size, hidden_size1)
        self.fc2 = nn.Linear(hidden_size1, hidden_size2)
        self.fc3 = nn.Linear(hidden_size2, output_size)
        self.relu = nn.ReLU()

    def forward(self, x):
        x = self.relu(self.fc1(x))
        x = self.relu(self.fc2(x))
        output = self.fc3(x)
        return output
```
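A minimal sketch of how this network might be trained as a surrogate for the fitness function; the optimizer, learning rate, epoch count, and tensor shapes here are illustrative assumptions, not values taken from the project:

```python
import torch
import torch.nn as nn

# Hypothetical sizes: 10 input variables, two hidden layers, scalar fitness output
net = Net(input_size=10, hidden_size1=64, hidden_size2=32, output_size=1)
optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)
criterion = nn.MSELoss()

X = torch.randn(256, 10)   # sampled candidate solutions (placeholder data)
y = torch.randn(256, 1)    # their true fitness values (placeholder data)

for epoch in range(100):
    optimizer.zero_grad()
    loss = criterion(net(X), y)  # regression onto the true fitness
    loss.backward()
    optimizer.step()
```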
- Selection: Tournament selection for choosing parent solutions
- Crossover: Uniform crossover with 0.5 probability
- Mutation: Gaussian mutation with 0.2 probability
- Population replacement: Generational replacement strategy
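These operators can be registered with DEAP's toolbox roughly as follows; the gene range, genome length, tournament size, and Gaussian mutation parameters are illustrative assumptions rather than values taken from the project:

```python
import random
from deap import base, creator, tools

creator.create("FitnessMax", base.Fitness, weights=(1.0,))
creator.create("Individual", list, fitness=creator.FitnessMax)

toolbox = base.Toolbox()
toolbox.register("attr_float", random.uniform, -1.0, 1.0)   # assumed gene range
toolbox.register("individual", tools.initRepeat, creator.Individual,
                 toolbox.attr_float, n=10)                  # assumed genome length
toolbox.register("population", tools.initRepeat, list, toolbox.individual)

toolbox.register("select", tools.selTournament, tournsize=3)  # tournament selection
toolbox.register("mate", tools.cxUniform, indpb=0.5)          # uniform crossover
toolbox.register("mutate", tools.mutGaussian,
                 mu=0.0, sigma=0.2, indpb=0.2)                # Gaussian mutation
```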
To install the dependencies and run the optimization:

```bash
git clone https://github.com/yourusername/hybrid-nn-ga.git
cd hybrid-nn-ga
pip install -r requirements.txt
python generic_NN_GA.py
```
The algorithm proceeds as follows (the evaluation step is sketched in code after this list):

- Initialize the population randomly
- Train the neural network on initial samples
- For each generation:
  - Select parents using tournament selection
  - Apply crossover and mutation operators
  - Evaluate offspring using the trained neural network
  - Update the population
  - Update the hall of fame and statistics
- Record the best solutions and population statistics
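The evaluation step can be expressed as a DEAP fitness function that queries the trained surrogate. This is a minimal sketch assuming the `Net` instance `net` trained above; `evaluate_with_surrogate` is a hypothetical helper name, and the one-element tuple follows DEAP's fitness convention:

```python
import torch

def evaluate_with_surrogate(individual):
    # Query the trained network instead of the expensive true fitness
    with torch.no_grad():
        x = torch.tensor(individual, dtype=torch.float32).unsqueeze(0)
        predicted = net(x).item()
    return (predicted,)  # DEAP expects fitness values as a tuple

toolbox.register("evaluate", evaluate_with_surrogate)
```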
Key parameters:

- `NGEN`: number of generations
- Population size: configurable based on problem complexity
- Crossover rate: 0.5
- Mutation rate: 0.2
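Before the loop below can run, the population, hall of fame, statistics, and logbook it uses have to be created. A sketch under the assumptions above; the population size, genome length `N`, and array shapes are illustrative:

```python
import numpy as np
from deap import tools

NGEN, N = 100, 10                 # generations and genome length (assumed)
pop = toolbox.population(n=100)   # illustrative population size
hof = tools.HallOfFame(1)

stats = tools.Statistics(lambda ind: ind.fitness.values)
stats.register("avg", np.mean)
stats.register("min", np.min)
stats.register("max", np.max)

logbook = tools.Logbook()
logbook.header = "gen", "evals", "avg", "min", "max"

# Arrays recording the search trajectory for later plotting
fbest = np.zeros((NGEN, 1))
best = np.zeros((NGEN, N))
std = np.zeros((NGEN, N))

# Initial evaluation so every individual starts with a valid fitness
for ind in pop:
    ind.fitness.values = toolbox.evaluate(ind)
```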
The main generational loop:

```python
for gen in range(NGEN):
    # Select the next generation individuals
    offspring = toolbox.select(pop, len(pop))
    offspring = list(map(toolbox.clone, offspring))

    # Apply crossover and mutation
    for child1, child2 in zip(offspring[::2], offspring[1::2]):
        if random.random() < 0.5:
            toolbox.mate(child1, child2)
            del child1.fitness.values
            del child2.fitness.values

    for mutant in offspring:
        if random.random() < 0.2:
            toolbox.mutate(mutant)
            del mutant.fitness.values

    # Evaluate the individuals with an invalid fitness
    invalid_ind = [ind for ind in offspring if not ind.fitness.valid]
    fitnesses = toolbox.map(toolbox.evaluate, invalid_ind)
    for ind, fit in zip(invalid_ind, fitnesses):
        ind.fitness.values = fit

    # Replace the old population with the offspring
    pop[:] = offspring

    # Update the hall of fame and the statistics
    hof.update(pop)
    record = stats.compile(pop)
    logbook.record(evals=len(invalid_ind), gen=gen, **record)
    print(logbook.stream)

    # Save more data along the evolution for later plotting
    fbest[gen] = hof[0].fitness.values
    best[gen, :N] = hof[0]
    std[gen, :N] = np.std([ind for ind in pop], axis=0)

print("Best individual is:", hof[0], hof[0].fitness.values)
```
The implementation produces visualizations of convergence, population diversity, and the evolution of the best solution.
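As an example, the `fbest` trace recorded in the loop can be turned into a convergence plot; this is a minimal sketch, with labels and styling as assumptions:

```python
import matplotlib.pyplot as plt

plt.plot(range(NGEN), fbest, label="best fitness")
plt.xlabel("Generation")
plt.ylabel("Fitness (surrogate prediction)")
plt.title("Hybrid NN-GA convergence")
plt.legend()
plt.show()
```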
The system includes feature importance analysis to identify key variables in the optimization process.
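One way such an analysis could work for an MLP surrogate is input-perturbation sensitivity; the sketch below is an assumed approach and hypothetical helper, not necessarily the project's exact method:

```python
import torch

def feature_importance(net, X, eps=0.1):
    # Assumed approach: perturb each input feature by eps and measure the
    # mean absolute change in the surrogate's prediction.
    with torch.no_grad():
        base = net(X)
        scores = []
        for i in range(X.shape[1]):
            X_pert = X.clone()
            X_pert[:, i] += eps
            scores.append((net(X_pert) - base).abs().mean().item())
    return scores  # one sensitivity score per input variable
```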
The search pattern resembles ant colony optimization in that many candidate paths are explored in parallel on the way to optimal solutions.
- PyTorch
- DEAP
- NumPy
- Matplotlib
- Base ant colony visualization: Wikimedia Commons
- DEAP framework documentation
- PyTorch neural network documentation