Dept. of Computer Science, Princeton University

To model data, we want to express assumptions about the data, infer hidden structure, make predictions, and simulate new data. In this talk, I will describe how probabilistic generative models provide a common toolkit to meet these challenges. I will first present these ideas in a toy setting and then survey the range of probabilistic generative models, from structural to algorithmic. Next, I will give an in-depth look at deep exponential families, a class of probability models that contains both predictive and interpretive models. I will end with the central computational problem in realizing the promise of probabilistic generative models: posterior inference. I will demonstrate why deriving inference algorithms is tedious and will touch on black box variational methods, which seek to alleviate this burden.
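
For context, a minimal sketch of the two ingredients named above, assuming the standard published formulations (Ranganath et al., AISTATS 2015 and 2014): a deep exponential family stacks layers of exponential-family latent variables, with each layer's natural parameter driven by the layer above through a link function g and weights w,

    z_{n,L,k} \sim \mathrm{ExpFam}_L(\eta), \qquad z_{n,\ell,k} \sim \mathrm{ExpFam}_\ell\big( g_\ell( w_{\ell,k}^\top z_{n,\ell+1} ) \big), \quad \ell = L-1, \dots, 1,

and black box variational inference avoids model-specific derivations by optimizing the variational objective \mathcal{L} with the score-function gradient,

    \nabla_\lambda \mathcal{L} = \mathbb{E}_{q(z;\lambda)}\big[ \nabla_\lambda \log q(z;\lambda)\, ( \log p(x, z) - \log q(z;\lambda) ) \big],

which can be estimated with Monte Carlo samples from q alone, requiring only the ability to sample from q and to evaluate the two log densities.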
