Deep Learning Architectures for AM

Topic Survey
  Area: Deep Learning + Additive Manufacturing (AM)
  Papers (250+ citations): 2,726
  Total Citations: 1.8M+
  Key Architectures: CNN, GAN, VAE, PINN
  AM Tasks: Design, Monitoring, Quality Control
  Speedup: 100-1000x vs. FEM

Deep learning architectures provide the computational backbone for ML applications in additive manufacturing. Understanding these architectures, from CNNs for image-based quality control to GANs for design generation and PINNs for physics-constrained simulation, is essential for implementing ML in AM workflows.

Contents
  1. Overview
  2. Convolutional Neural Networks
  3. Generative Models (GAN, VAE)
  4. Physics-Informed Neural Networks
  5. Transformers & Attention
  6. Key Papers

Overview

Deep learning has revolutionized AM by enabling: (1) real-time process monitoring through image analysis, (2) surrogate models that replace expensive simulations, (3) generative design of novel structures, and (4) physics-constrained predictions that respect conservation laws.

  • Capstone papers: 2,726
  • Speedup vs. FEM: up to 1000x
  • Defect detection accuracy: >95%
  • Inference: real-time

Convolutional Neural Networks

AM Applications

Application | Architecture | Input | Output
Defect classification | ResNet, VGG | Layer images | Defect type
Melt pool analysis | Custom CNN | Thermal images | Geometry features
Microstructure prediction | U-Net | Process parameters | Grain structure
Surface roughness | CNN regression | Surface images | Ra value
Topology optimization | Encoder-decoder | Boundary conditions | Material distribution

Key Architectures

ResNet (Residual Networks)

Skip (residual) connections let gradients bypass intermediate layers, making very deep networks trainable. Widely used for high-accuracy classification tasks.

  • Depth: 18, 50, 101, 152 layers
  • AM use: Defect classification, quality prediction
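
A minimal PyTorch sketch of the residual (skip) connection idea; the block name and channel count are illustrative, not taken from a specific AM model:

    import torch.nn as nn

    class ResidualBlock(nn.Module):
        """Basic residual block: output = ReLU(F(x) + x)."""
        def __init__(self, channels=64):
            super().__init__()
            self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
            self.bn1 = nn.BatchNorm2d(channels)
            self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
            self.bn2 = nn.BatchNorm2d(channels)
            self.relu = nn.ReLU(inplace=True)

        def forward(self, x):
            out = self.relu(self.bn1(self.conv1(x)))
            out = self.bn2(self.conv2(out))
            return self.relu(out + x)  # skip connection: gradients can bypass the conv path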

U-Net

Encoder-decoder with skip connections. Standard for segmentation and image-to-image translation.

  • Strength: Works with small datasets
  • AM use: Melt pool segmentation, porosity detection
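
A toy two-level encoder-decoder in PyTorch illustrating the U-Net skip connection via channel concatenation; the single-channel input and output (e.g. a melt pool image and a segmentation mask) are assumptions made for illustration:

    import torch
    import torch.nn as nn

    def conv_block(c_in, c_out):
        return nn.Sequential(nn.Conv2d(c_in, c_out, 3, padding=1), nn.ReLU(),
                             nn.Conv2d(c_out, c_out, 3, padding=1), nn.ReLU())

    class TinyUNet(nn.Module):
        def __init__(self, in_ch=1, out_ch=1):
            super().__init__()
            self.enc = conv_block(in_ch, 32)
            self.down = nn.MaxPool2d(2)
            self.bottleneck = conv_block(32, 64)
            self.up = nn.ConvTranspose2d(64, 32, kernel_size=2, stride=2)
            self.dec = conv_block(64, 32)   # 64 channels = 32 upsampled + 32 from the skip
            self.head = nn.Conv2d(32, out_ch, 1)

        def forward(self, x):               # x: (N, in_ch, H, W) with H, W even
            e = self.enc(x)                 # encoder features, saved for the skip
            b = self.bottleneck(self.down(e))
            d = self.dec(torch.cat([self.up(b), e], dim=1))  # skip connection by concatenation
            return self.head(d)             # per-pixel logits (e.g. a porosity mask)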

EfficientNet

Compound scaling of depth, width, and resolution with a single coefficient, giving a strong accuracy/efficiency tradeoff.

  • Versions: B0-B7 (increasing complexity)
  • AM use: Real-time monitoring with limited compute
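
A back-of-the-envelope sketch of compound scaling: depth, width, and input resolution all grow with one coefficient phi. The alpha, beta, gamma values are those reported in the original EfficientNet paper; the released B0-B7 models round the resulting numbers, so treat this purely as an illustration:

    # Compound scaling: depth x alpha**phi, width x beta**phi, resolution x gamma**phi,
    # with alpha * beta**2 * gamma**2 ~ 2, so FLOPs roughly double per unit of phi.
    ALPHA, BETA, GAMMA = 1.2, 1.1, 1.15

    def compound_scale(phi, base_resolution=224):
        return {
            "depth_multiplier": round(ALPHA ** phi, 3),
            "width_multiplier": round(BETA ** phi, 3),
            "input_resolution": round(base_resolution * GAMMA ** phi),
        }

    for phi in range(4):   # larger phi roughly corresponds to the larger B-variants
        print(phi, compound_scale(phi))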

Generative Models (GAN, VAE)

Generative Adversarial Networks

GANs learn to generate realistic samples by adversarial training between generator and discriminator.

GAN Type | AM Application | Key Feature
cGAN (conditional) | Design generation | Controls output properties
Pix2Pix | Image-to-image translation | Thermal-to-defect maps
CycleGAN | Domain adaptation | Sim-to-real transfer
StyleGAN | Microstructure synthesis | High-resolution, controllable
3D-GAN | Lattice structure design | Volumetric generation
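
A minimal PyTorch sketch of the adversarial training step described above; the tiny fully-connected generator and discriminator, the 256-dimensional "sample" vectors, and the hyperparameters are placeholders rather than a specific AM model:

    import torch
    import torch.nn as nn

    latent_dim = 64
    G = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(), nn.Linear(128, 256))  # toy generator
    D = nn.Sequential(nn.Linear(256, 128), nn.LeakyReLU(0.2), nn.Linear(128, 1))   # toy discriminator
    opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
    opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
    bce = nn.BCEWithLogitsLoss()

    def train_step(real):                        # real: (batch, 256) flattened samples
        batch = real.size(0)
        fake = G(torch.randn(batch, latent_dim))

        # 1) Discriminator: push real toward label 1, generated toward label 0
        d_loss = bce(D(real), torch.ones(batch, 1)) + bce(D(fake.detach()), torch.zeros(batch, 1))
        opt_d.zero_grad()
        d_loss.backward()
        opt_d.step()

        # 2) Generator: try to make the discriminator label fakes as real
        g_loss = bce(D(fake), torch.ones(batch, 1))
        opt_g.zero_grad()
        g_loss.backward()
        opt_g.step()
        return d_loss.item(), g_loss.item()

    print(train_step(torch.randn(16, 256)))      # one step on stand-in "real" data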

Variational Autoencoders

VAEs learn compressed latent representations that are useful for design exploration, e.g. interpolating between existing part designs in latent space.
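
A minimal PyTorch sketch of the VAE pieces: an encoder that outputs mean and log-variance, the reparameterization trick, a decoder, and the reconstruction + KL loss. Dimensions are illustrative; in practice the input could be a flattened design or microstructure image:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class TinyVAE(nn.Module):
        def __init__(self, x_dim=256, z_dim=8):
            super().__init__()
            self.enc = nn.Linear(x_dim, 64)
            self.mu, self.logvar = nn.Linear(64, z_dim), nn.Linear(64, z_dim)
            self.dec = nn.Sequential(nn.Linear(z_dim, 64), nn.ReLU(), nn.Linear(64, x_dim))

        def forward(self, x):
            h = F.relu(self.enc(x))
            mu, logvar = self.mu(h), self.logvar(h)
            z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)   # reparameterization trick
            return self.dec(z), mu, logvar

    def vae_loss(x, x_hat, mu, logvar):
        recon = F.mse_loss(x_hat, x, reduction="sum")                 # reconstruction term
        kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())  # KL divergence to N(0, I)
        return recon + kl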

Diffusion Models

An emerging class of generative models that produce high-quality samples by iteratively denoising random noise.
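
A sketch of the DDPM-style forward (noising) process under an assumed linear variance schedule; a denoising network would be trained to predict the added noise and then invert the process step by step:

    import torch

    T = 1000
    betas = torch.linspace(1e-4, 0.02, T)            # illustrative linear variance schedule
    alphas_bar = torch.cumprod(1.0 - betas, dim=0)   # cumulative products alpha_bar_t

    def noise_sample(x0, t):
        """Forward process: x_t = sqrt(a_bar)*x_0 + sqrt(1 - a_bar)*eps."""
        eps = torch.randn_like(x0)
        a_bar = alphas_bar[t]
        x_t = a_bar.sqrt() * x0 + (1.0 - a_bar).sqrt() * eps
        return x_t, eps   # a denoising network is trained to predict eps from (x_t, t)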

Physics-Informed Neural Networks

PINNs embed physical laws directly into neural network training, ensuring predictions respect conservation of mass, momentum, and energy.

Formulation

The training loss combines a data-fitting term with residuals of the governing equations:
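
A common way to write this composite loss (following Raissi et al., 2019, with an optional weighting factor lambda), where u_theta is the network approximation and N[.] the governing PDE operator evaluated by automatic differentiation:

    \mathcal{L}(\theta) =
      \underbrace{\frac{1}{N_d}\sum_{i=1}^{N_d}\bigl|u_\theta(x_i,t_i)-u_i\bigr|^2}_{\text{data fit}}
      \;+\;
      \lambda\,\underbrace{\frac{1}{N_r}\sum_{j=1}^{N_r}\bigl|\mathcal{N}[u_\theta](x_j,t_j)\bigr|^2}_{\text{physics residual}}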

AM Applications

Physics | Governing Equation | AM Application
Heat transfer | Heat equation | Melt pool temperature prediction
Fluid flow | Navier-Stokes equations | Melt pool convection
Solid mechanics | Elasticity equations | Residual stress prediction
Phase change | Stefan problem | Solidification modeling
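
A PyTorch sketch of how the physics residual is computed for the heat-transfer row above, using automatic differentiation; the network size, the 2D geometry, and the constant thermal diffusivity are simplifying assumptions:

    import torch
    import torch.nn as nn

    # small fully-connected network T(x, y, t) approximating temperature
    net = nn.Sequential(nn.Linear(3, 64), nn.Tanh(), nn.Linear(64, 64), nn.Tanh(), nn.Linear(64, 1))
    alpha = 1e-5   # illustrative thermal diffusivity [m^2/s]

    def grad(out, var):
        """d(out)/d(var), keeping the graph so higher derivatives are possible."""
        return torch.autograd.grad(out, var, grad_outputs=torch.ones_like(out), create_graph=True)[0]

    def heat_residual(x, y, t):
        """Residual of T_t - alpha * (T_xx + T_yy) at collocation points."""
        x, y, t = (v.requires_grad_(True) for v in (x, y, t))
        T = net(torch.stack([x, y, t], dim=-1))
        T_t = grad(T, t)
        T_xx, T_yy = grad(grad(T, x), x), grad(grad(T, y), y)
        return T_t - alpha * (T_xx + T_yy)   # driven toward zero by the physics loss term

    # physics loss over random collocation points in (x, y, t)
    x, y, t = torch.rand(128), torch.rand(128), torch.rand(128)
    loss_physics = heat_residual(x, y, t).pow(2).mean()
    print(loss_physics.item())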

Advantages for AM

  • Predictions respect conservation of mass, momentum, and energy by construction
  • Act as fast surrogates for expensive simulations, with 100-1000x speedups over FEM
  • Useful where experimental data are sparse, since the governing physics constrains the solution

Transformers & Attention

Self-Attention Mechanism

Captures long-range dependencies in sequential or spatial data:
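
The standard scaled dot-product attention from Vaswani et al. (2017), where Q, K, and V are the query, key, and value matrices and d_k is the key dimension:

    \mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{QK^{\top}}{\sqrt{d_k}}\right) V

As a minimal PyTorch illustration (single head, no masking):

    import torch

    def attention(Q, K, V):
        """Scaled dot-product attention for (batch, seq_len, d_k) tensors."""
        scores = Q @ K.transpose(-2, -1) / K.size(-1) ** 0.5   # (batch, seq_len, seq_len)
        return torch.softmax(scores, dim=-1) @ V               # attention-weighted sum of values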

AM Applications

Architecture | Application | Advantage
Vision Transformer | Layer-wise defect detection | Global context
Temporal Transformer | Sensor data analysis | Long-range dependencies
Graph Transformer | Lattice structure analysis | Topology awareness
Point Cloud Transformer | 3D scan processing | Handles unordered point sets

Key Papers

Deep Residual Learning for Image Recognition
He et al., CVPR 2016 | ResNet foundation | 180,000+ citations
Generative Adversarial Nets
Goodfellow et al., NeurIPS 2014 | GAN foundation | 60,000+ citations
Physics-informed neural networks
Raissi et al., JCP 2019 | PINN methodology | 8,000+ citations
Attention Is All You Need
Vaswani et al., NeurIPS 2017 | Transformer architecture | 100,000+ citations
Deep learning for additive manufacturing: State of the art
Survey of DL methods in AM | 1,500+ citations
