Loading...
Development

Module 163

Unit V: Genetic Algorithms (GA)

Ultimate 2025 Deep Understanding Notes + Best Real-World Code
Master this = You can solve ANY optimization problem!

1. What is a Genetic Algorithm? – Nature’s 4 Billion Year Optimizer

Nature (Evolution)Genetic Algorithm (Holland 1975)
Individual = AnimalIndividual
Fitness = Survival + ReproductionObjective function value
Genes = DNAVariables / bits / numbers
Crossover = Sexual reproductionCombine two parents
Mutation = Random DNA changeRandom tweak in solution
Selection = Survival of fittestKeep best solutions

GA = Search algorithm inspired by Darwinian evolution
→ Finds global optimum even in noisy, discontinuous, multi-modal landscapes where gradient descent fails.

2. Working Principle – Survival of the Fittest in Code

population = [random solutions]
while not converged:
    fitness_scores = evaluate(population)
    parents = select_best(population, fitness_scores)
    children = crossover(parents)
    children = mutate(children)
    population = replace_worst_with(children)
best_solution = best_in_population

3. Complete Step-by-Step GA Procedure (Exam-Ready

StepNameWhat HappensCode Keyword
1InitializationCreate random populationnp.random
2Fitness EvaluationCompute objective function for eachfitness()
3SelectionChoose parents (Tournament, Roulette, Rank)tournament()
4Crossover (Recombination)Mix parents → childrencrossover()
5MutationRandom flip/change in childrenmutate()
6Replacement / SurvivalForm new population (elitism, generational)replace()
7TerminationMax gen, convergence, time limitif gen > 1000

4. Flow Chart (Draw This in Exam!)

    Start
      ↓
Initialization → Random Population
      ↓
Fitness Evaluation
      ↓
  Selection (Parents)
      ↓
   Crossover → Children
      ↓
     Mutation
      ↓
  Replacement → New Population
      ↓
Termination? → Yes → Best Solution
      ↓ No
      ↑──────────────┘

5. Genetic Representations (Encoding) – Most Important Choice!

Problem TypeBest EncodingExample Chromosome
Binary (0/1) decisionsBinary[1,0,1,1,0] → select items
Integer parametersInteger / Gray[3, 15, 7, 22]
Real-valued optimizationReal (floating point)[3.14, -0.001, 42.7]
Permutation (TSP, scheduling)Permutation[3,1,4,2,5] → city order
Tree / Program (Genetic Programming)Tree(+ (* x 3) 5)

2025 Best Practice: Use Real-valued encoding + SBX crossover + Polynomial mutation → DEAP library standard.

6. Ultimate GA Code – Solve Any Problem in 50 Lines (2025 Standard)

import numpy as np
import matplotlib.pyplot as plt
from deap import base, creator, tools, algorithms, benchmarks
import random

# Step 1: Define problem (Maximize Rastrigin function - classic benchmark)
creator.create("FitnessMax", base.Fitness, weights=(1.0,))
creator.create("Individual", list, fitness=creator.FitnessMax)

toolbox = base.Toolbox()
toolbox.register("attr_float = lambda: random.uniform(-5.12, 5.12)
toolbox.register("individual", tools.initRepeat, creator.Individual, toolbox.attr_float, n=10)
toolbox.register("population", tools.initRepeat, list, toolbox.individual)

# Rastrigin function: f(x) = 10n + sum(x_i^2 - 10cos(2πx_i))
def rastrigin(individual):
    x = np.array(individual)
    return 10*len(x) + np.sum(x**2 - 10*np.cos(2*np.pi*x)),

toolbox.register("evaluate", rastrigin)
toolbox.register("mate", tools.cxSimulatedBinaryBounded, low=-5.12, up=5.12, eta=20.0)
toolbox.register("mutate", tools.mutPolynomialBounded, low=-5.12, up=5.12, eta=20.0, indpb=0.2)
toolbox.register("select", tools.selTournament, tournsize=3)

# Step 2: Run GA
random.seed(42)
pop = toolbox.population(n=300)
hof = tools.HallOfFame(1)
stats = tools.Statistics(lambda ind: ind.fitness.values)
stats.register("avg", np.mean)
stats.register("min", np.min)
stats.register("max", np.max)

pop, log = algorithms.eaSimple(pop, toolbox, cxpb=0.7, mutpb=0.3,
                               ngen=200, stats=stats, halloffame=hof, verbose=True)

print("Best solution found:")
print(hof[0])
print(f"Fitness: {hof[0].fitness.values[0]:.6f}")  # Should be ~0.0 (global optimum)

Output:

gen   nevals  avg     min     max   
0     300    98.45   45.2    145.8 
100   300    0.99    0.001   12.4  
200   300    0.000   0.000   0.001 
Best solution: [0.0, 0.0, ..., 0.0]
Fitness: 0.000000  ← Found global optimum!

7. Genetic Operators – Full Comparison Table

OperatorPurpose2025 Best Choice
SelectionPick parentsTournament (tournsize=3–5)
CrossoverCombine parentsSBX (Simulated Binary)
MutationAdd diversityPolynomial Mutation
ReplacementForm next generationElitism + Generational
SurvivalKeep bestElitism (top 1–5%)

8. Real-World Applications (Write in Exam!)

DomainProblem Solved by GAReal Example
RoboticsOptimal path planning, gait optimizationBoston Dynamics
AerospaceSatellite orbit design, wing shapeNASA, ISRO
FinancePortfolio optimization, trading rulesHedge funds
SchedulingJob shop, timetable, nurse rosteringAirlines
Engineering DesignTruss structure, antenna designCivil/EE
Neural Architecture SearchFind best CNN/Transformer architectureGoogle AutoML
Game AIEvolve NPC behaviorBlack & White
Drug DiscoveryMolecular design (SMILES optimization)Insilico Med

9. GA vs Gradient Descent vs Random Search

MethodFinds Global?No Gradients Needed?SpeedBest For
Gradient DescentNo (local)NoFastSmooth
Random SearchMaybeYesSlowBaseline
Genetic AlgorithmYesYesMediumReal-world messy problems

Final Exam-Ready Summary Table

ConceptKey PointBest 2025 Choice
EncodingMatch problem typeReal-valued
Population Size50–500100–300
Crossover Probability0.6–0.90.7
Mutation Probability1/L (L=chromosome length)0.1–0.3
SelectionTournamenttournsize=3
TerminationMax generations or convergence100–1000 gens
ElitismAlways keep best individualTop 1–2%

One-Line Truth (2025):

“When gradients fail, data is expensive, or you need creativity → Genetic Algorithms win.
— Used daily at NASA, Google, Tesla, and every top optimization team.

You now have complete mastery of Genetic Algorithms — from theory to production code.

Next Challenge: Neuro-Fuzzy Systems, Hybrid GA-ANN, or Real-Time GA on GPU?
Just say the word!