We often have discrete count data with continuous latent structure or continuous regressors. It can be hard to match these two up in a Bayesian framework because the discrete likelihoods aren't conjugate to Gaussian priors. Fortunately, there's a cool trick (Pólya-gamma augmentation) that makes the discrete likelihood conditionally conjugate with a Gaussian prior, facilitating:
- more efficient Bayesian logistic regression
- structured sparse Gaussian models
- hierarchical Gaussian models (e.g. Gaussian mixture models) with binary observations
- time series or Gaussian process models that capture dependencies between observations
The trick extends to other observation models too, like binomial, negative binomial, and multinomial observations. So if you know about LDA, it's now easy to combine LDA with Gaussian structure, like correlated or dynamic topics.
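To make the conditional conjugacy concrete, here's a minimal sketch of a Pólya-gamma Gibbs sampler for Bayesian logistic regression. It's a toy implementation, not a reference one: the PG(b, c) draw is approximated by truncating the infinite sum-of-gammas representation from Polson, Scott, and Windle (2013) at K terms, and the data, prior, and iteration counts are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def pg_draw(b, c, K=200):
    """Approximate draw from PG(b, c): the sum-of-gammas
    representation, truncated at K terms."""
    k = np.arange(1, K + 1)
    g = rng.gamma(shape=b, scale=1.0, size=K)
    return np.sum(g / ((k - 0.5) ** 2 + c ** 2 / (4.0 * np.pi ** 2))) / (2.0 * np.pi ** 2)

# Synthetic logistic-regression data (illustrative values)
n, d = 200, 2
X = rng.normal(size=(n, d))
beta_true = np.array([1.5, -1.0])
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-X @ beta_true))).astype(float)

B_inv = np.eye(d)     # prior precision: beta ~ N(0, I)
kappa = y - 0.5       # kappa_i = y_i - 1/2

beta = np.zeros(d)
for _ in range(100):  # Gibbs sweeps
    # 1) draw auxiliaries: omega_i | beta ~ PG(1, x_i' beta)
    psi = X @ beta
    omega = np.array([pg_draw(1.0, c) for c in psi])
    # 2) given omega, beta's conditional is Gaussian:
    #    beta | omega ~ N(m, V), V = (X' Omega X + B^-1)^-1, m = V X' kappa
    V = np.linalg.inv(X.T @ (omega[:, None] * X) + B_inv)
    m = V @ (X.T @ kappa)
    beta = rng.multivariate_normal(m, V)
```

The same two-step pattern handles the other observation models: for binomial counts y_i out of n_i trials, only the auxiliary changes to omega_i ~ PG(n_i, psi_i) and kappa_i = y_i - n_i/2, while the Gaussian update for beta stays identical.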