Optimization Algorithms
Optimization algorithms are methods for finding the best solution from a set of possible solutions. In machine learning and additive manufacturing, metaheuristic algorithms—inspired by natural phenomena—are widely used to optimize complex, non-linear problems where traditional methods fail.
These algorithms excel at finding near-optimal solutions for problems with many parameters, non-smooth objectives, or no closed-form solution. In AM, they optimize process parameters (temperature, speed, layer height) to maximize part quality or minimize defects.
Overview
Traditional gradient-based optimization requires smooth, differentiable objective functions. Metaheuristic algorithms work without derivatives, making them suitable for:
- Black-box functions: When you can evaluate f(x) but can't compute gradients
- Non-convex landscapes: Multiple local minima where gradient methods get stuck
- Mixed variables: Combinations of continuous and discrete parameters
- Expensive evaluations: When each evaluation requires a simulation or experiment
Particle Swarm Optimization (PSO)
How It Works
Inspired by bird flocking and fish schooling. A "swarm" of particles moves through the solution space, each tracking its personal best position and the global best found by any particle.
Update rules (r₁ and r₂ are fresh uniform random numbers in [0, 1] drawn each step; see the sketch at the end of this section):
- Velocity = inertia × current velocity + cognitive × r₁ × (personal best − position) + social × r₂ × (global best − position)
- Position = position + velocity
```
Iteration 1           Iteration 10          Iteration 50

 ○       ○              ○○  ○
   ○   ○    ○          ○ ○○ ○                 ○○○
 ○    ○      ───▶       ○○○       ───▶        ○●○
     ○    ○             ○○                    ○○
 ○     ○                 ○

 (scattered)           (converging)         (at optimum ●)
```
Key Parameters
- Swarm size: Number of particles (typically 20-50)
- Inertia weight (w): Controls momentum (0.4-0.9)
- Cognitive coefficient (c₁): Personal best attraction (~2.0)
- Social coefficient (c₂): Global best attraction (~2.0)
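A minimal NumPy sketch of global-best PSO, assuming position clamping to the bounds and a toy sphere objective (all names here are illustrative, not from a specific library):

```python
import numpy as np

def pso(f, bounds, n_particles=30, n_iters=100, w=0.7, c1=2.0, c2=2.0, seed=0):
    """Minimize f over box bounds with a basic global-best PSO."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T          # bounds: [(lo, hi), ...]
    dim = lo.size
    x = rng.uniform(lo, hi, size=(n_particles, dim))    # positions
    v = np.zeros((n_particles, dim))                    # velocities
    pbest = x.copy()                                    # personal bests
    pbest_val = np.apply_along_axis(f, 1, x)
    g = pbest[pbest_val.argmin()].copy()                # global best

    for _ in range(n_iters):
        r1, r2 = rng.random((2, n_particles, dim))      # fresh random factors
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)                      # keep particles in bounds
        vals = np.apply_along_axis(f, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()

# Toy objective: sphere function, minimum at the origin.
best_x, best_val = pso(lambda p: np.sum(p**2), bounds=[(-5, 5)] * 3)
print(best_x, best_val)
```

Clamping to the bounds is just one boundary-handling choice; reflecting particles or damping their velocity at the boundary are common alternatives.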
Genetic Algorithm (GA)
How It Works
Inspired by natural evolution. A population of solutions evolves over generations through selection, crossover, and mutation.
Steps:
- Initialize: Random population of solutions (chromosomes)
- Evaluate: Compute fitness of each solution
- Select: Choose parents based on fitness (tournament, roulette)
- Crossover: Combine parents to create offspring
- Mutate: Random changes to maintain diversity
- Repeat until convergence
```
Parent 1: [A B C D E F]        Parent 2: [a b c d e f]
                 ↓  (one-point crossover)  ↓
Child 1:  [A B C d e f]        Child 2:  [a b c D E F]
                 ↓  (mutation)
Child 1:  [A B C d X f]        (X = random mutation)
```
Key Parameters
- Population size: Number of solutions per generation (50-200)
- Crossover rate: Probability of crossover (0.6-0.9)
- Mutation rate: Probability of mutation (0.01-0.1)
- Selection pressure: How strongly fitness affects selection
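The steps above condense into a short real-coded GA sketch; tournament selection, one-point crossover, and uniform-reset mutation are assumed operator choices here, not the only options:

```python
import numpy as np

def ga(f, bounds, pop_size=100, n_gens=200, cx_rate=0.8, mut_rate=0.05, seed=0):
    """Minimize f with a simple real-coded GA."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    dim = lo.size
    pop = rng.uniform(lo, hi, size=(pop_size, dim))     # 1. initialize

    def tournament(fit, k=3):
        # 3. select: fittest of k random individuals (lower fitness = better).
        idx = rng.integers(pop_size, size=k)
        return pop[idx[fit[idx].argmin()]]

    for _ in range(n_gens):
        fit = np.apply_along_axis(f, 1, pop)            # 2. evaluate
        children = []
        while len(children) < pop_size:
            p1, p2 = tournament(fit).copy(), tournament(fit).copy()
            if rng.random() < cx_rate and dim > 1:      # 4. one-point crossover
                cut = rng.integers(1, dim)
                p1[cut:], p2[cut:] = p2[cut:].copy(), p1[cut:].copy()
            for child in (p1, p2):
                mask = rng.random(dim) < mut_rate       # 5. uniform-reset mutation
                child[mask] = rng.uniform(lo[mask], hi[mask])
                children.append(child)
        pop = np.array(children[:pop_size])             # 6. next generation
    fit = np.apply_along_axis(f, 1, pop)
    return pop[fit.argmin()], fit.min()

best_x, best_val = ga(lambda p: np.sum(p**2), bounds=[(-5, 5)] * 4)
print(best_x, best_val)
```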
Simulated Annealing (SA)
How It Works
Inspired by metallurgical annealing. A single solution explores the space, occasionally accepting worse solutions to escape local minima. The probability of accepting worse solutions decreases with "temperature."
Acceptance rule:
- If the new solution is better: always accept
- If worse: accept with probability exp(−ΔE / T), where ΔE is the increase in the objective. For example, with ΔE = 2 the acceptance probability is exp(−0.2) ≈ 0.82 at T = 10, but only exp(−4) ≈ 0.02 at T = 0.5.
```
Objective
 ▲
 │         ┌─┐
 │       ┌─┘ │     Global
 │     ┌─┘   │     optimum
 │     │     │        ↓
 │     │     └────────●────
 │     │
 │     │  Local
 │     └──minimum
 │
 └───────────────────────────▶ Solution space
```
High T: can "jump" out of a local minimum
Low T: refines the solution near the global optimum
Key Parameters
- Initial temperature (T₀): Starting temperature (problem-dependent)
- Cooling rate (α): How fast temperature decreases (0.8-0.99)
- Final temperature: When to stop
- Iterations per temperature: Steps before cooling
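A compact sketch of the loop, assuming geometric cooling and a Gaussian neighbor step on a toy one-dimensional objective with several local minima (the helper names are illustrative):

```python
import math
import random

def simulated_annealing(f, x0, neighbor, t0=10.0, alpha=0.95,
                        t_min=1e-3, iters_per_temp=50, seed=0):
    """Minimize f starting from x0; `neighbor` proposes a nearby solution."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, best_fx = x, fx
    t = t0
    while t > t_min:
        for _ in range(iters_per_temp):
            y = neighbor(x, rng)
            fy = f(y)
            delta = fy - fx
            # Always accept improvements; accept worse moves with prob exp(-ΔE/T).
            if delta < 0 or rng.random() < math.exp(-delta / t):
                x, fx = y, fy
                if fx < best_fx:
                    best, best_fx = x, fx
        t *= alpha                                      # geometric cooling
    return best, best_fx

# Toy 1-D objective with several local minima.
f = lambda x: x**2 + 10 * math.sin(x)
step = lambda x, rng: x + rng.gauss(0, 0.5)
print(simulated_annealing(f, x0=5.0, neighbor=step))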
Comparison
| Algorithm | Inspiration | Parallelism | Best For | Weaknesses |
|---|---|---|---|---|
| PSO | Bird flocking | Parallel (swarm) | Continuous optimization, fast convergence | May converge prematurely |
| GA | Evolution | Parallel (population) | Discrete/mixed variables, complex landscapes | Many parameters to tune |
| SA | Metal annealing | Sequential | Combinatorial optimization, simple to implement | Slow, single solution |
Applications in Additive Manufacturing
A typical task is finding the temperature, print speed, and layer height that maximize tensile strength. Hassan et al. (2024) note that metaheuristic algorithms combined with surrogate models (such as neural networks) can find optimal print parameters with fewer experiments than full factorial designs.
GA is commonly used to find the optimal orientation for minimizing support material, build time, or surface roughness—a discrete optimization problem with complex interactions.
Common Use Cases
- Multi-objective optimization: Balancing strength, cost, and time
- Neural network hyperparameters: Finding optimal learning rate, layer sizes
- Path planning: Optimizing tool paths for minimal print time
- Lattice structure design: Finding optimal unit cell parameters
Hybrid Approaches
Modern AM optimization often combines metaheuristics with:
- Surrogate models: Use ML models to approximate expensive simulations (sketched below)
- Bayesian optimization: Sample-efficient alternative for expensive functions
- Multi-fidelity methods: Use cheap approximations to guide expensive evaluations
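As a minimal illustration of the surrogate idea in the first bullet, the sketch below fits a cheap polynomial to a handful of samples of a hypothetical expensive_objective and searches the surrogate instead of the real function; in practice the surrogate is typically a neural network or Gaussian process, and the inner search a metaheuristic such as the PSO above:

```python
import numpy as np

# Stand-in for a slow simulation or a physical print experiment.
def expensive_objective(x):
    return (x - 2.0)**2 + 0.3 * np.sin(5 * x)

rng = np.random.default_rng(0)
X = rng.uniform(-5, 5, size=12)            # small initial design (12 "experiments")
y = np.array([expensive_objective(x) for x in X])

# Fit a cheap polynomial surrogate to the observed data.
surrogate = np.poly1d(np.polyfit(X, y, deg=4))

# Optimize the surrogate densely (cheap), then verify the best candidate
# with a single real evaluation instead of many.
grid = np.linspace(-5, 5, 2001)
candidate = grid[surrogate(grid).argmin()]
print(candidate, expensive_objective(candidate))
```

The expensive function is evaluated only at the surrogate's proposed optimum, which is how surrogate-assisted loops cut the number of costly experiments.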
See Also
- Machine Learning
- Reinforcement Learning — Another approach to sequential decision making
- Process Optimization — AM parameter tuning
References
- Kennedy, J. & Eberhart, R. (1995). Particle swarm optimization. IEEE Int. Conf. Neural Networks.
- Holland, J.H. (1975). Adaptation in Natural and Artificial Systems. University of Michigan Press.
- Kirkpatrick, S., Gelatt, C.D., & Vecchi, M.P. (1983). Optimization by simulated annealing. Science, 220(4598), 671-680.
- Hassan, M., et al. (2024). A review of AI for optimization of 3D printing. Composites Part C.