MIA Talks

Primer: Stochastic gradient-based variational inference

October 21, 2020
Pyro team, Broad Institute

Black-box variational inference (BBVI) algorithms recast the intractable integrals that arise in Bayesian inference as optimization problems solvable with off-the-shelf scientific computing and automatic differentiation software, greatly expanding the range and scale of problems that Bayesian methods can address in practice. In this primer we give an introduction to Bayesian inference with BBVI, grounded in concrete example models. We start with an introduction to Bayesian modeling, then discuss variational inference, including the variational objective function, the choice of approximate posterior distribution, data subsampling, and gradient estimation with automatic differentiation. Finally, within this framework we construct variational autoencoders (VAEs), which incorporate neural networks into both the model and the inference procedure.
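As a minimal, self-contained illustration of the ideas the primer covers (not Pyro code, and not the speakers' material), the sketch below runs stochastic gradient-based variational inference on a toy conjugate model: a Gaussian prior on a latent mean with Gaussian observations. It fits a Gaussian approximate posterior by Monte Carlo gradient ascent on the ELBO, using the reparameterization trick for gradient estimation; the gradients are written out by hand here rather than obtained by automatic differentiation, and the model, step sizes, and variable names are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model (assumed for illustration): z ~ N(0, 1), x_i | z ~ N(z, 1).
z_true = 1.5
x = rng.normal(z_true, 1.0, size=50)
n = len(x)

# The model is conjugate, so the exact posterior is available for checking:
# p(z | x) = N(sum(x) / (n + 1), 1 / (n + 1)).
post_mean = x.sum() / (n + 1)
post_std = (n + 1) ** -0.5

# Variational family q(z) = N(mu, sigma^2), parameterized by (mu, log_s)
# with sigma = exp(log_s) so both parameters are unconstrained.
mu, log_s = 0.0, 0.0
lr, num_steps, num_mc = 0.01, 3000, 8

for step in range(num_steps):
    sigma = np.exp(log_s)
    eps = rng.standard_normal(num_mc)
    z = mu + sigma * eps                      # reparameterization trick
    # d/dz log p(x, z): sum_i (x_i - z) from the likelihood, -z from the prior.
    dlogjoint_dz = (x.sum() - n * z) - z
    # Pathwise (reparameterized) Monte Carlo gradients of the ELBO;
    # the entropy of q contributes exactly +1 to the log_s gradient.
    grad_mu = dlogjoint_dz.mean()
    grad_log_s = (dlogjoint_dz * sigma * eps).mean() + 1.0
    mu += lr * grad_mu                        # gradient *ascent* on the ELBO
    log_s += lr * grad_log_s

# The fitted q should closely match the exact posterior.
print(mu, np.exp(log_s))
```

In a BBVI framework such as Pyro, the hand-derived `dlogjoint_dz` line is exactly what automatic differentiation replaces: the user supplies only the model and the variational family, and the same stochastic gradient updates are produced mechanically.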