MIA Talks

Primer: Only connect: The variety and splendor of neural network architectures

March 4, 2020
Data Sciences Platform, Broad Institute

Beginning with a brief history of connectionism and convolutional neural networks (CNNs), we will present several recent innovations in neural network architecture design. Motivated by the vanishing and exploding gradients problem, we show how both residual and long-range skip connections allow models to grow deeper and more powerful. Skip connections organized hierarchically, as in the U-Net architecture, naturally apply to segmentation problems, such as the anatomical segmentation of cardiac MRI. We conclude with a discussion of recent innovations in CNN training, including one-cycle learning and adaptive pooling.
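The residual connection mentioned above can be sketched in a few lines: the block computes y = x + f(x), so the identity path carries gradients unchanged through each layer, which is what lets residual networks grow deep without vanishing gradients. This is a minimal NumPy sketch, not the talk's implementation; the function names and shapes are illustrative assumptions.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def residual_block(x, W1, W2):
    # y = x + f(x): a two-layer transform f plus an identity skip.
    # The identity term contributes a gradient of 1, so the gradient
    # through a stack of such blocks never shrinks to zero.
    return x + W2 @ relu(W1 @ x)

rng = np.random.default_rng(0)
x = rng.standard_normal(8)

# Illustrative edge case: with zero-initialized weights, f(x) = 0 and
# the block is exactly the identity, so even a very deep stack of
# residual blocks starts out well-conditioned.
W1 = np.zeros((8, 8))
W2 = np.zeros((8, 8))
y = residual_block(x, W1, W2)
```

By contrast, the long-range skip connections in U-Net concatenate early, high-resolution feature maps onto the decoder's upsampled maps rather than adding them, which is what preserves fine spatial detail for segmentation.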