Statistics Colloquium: Dr. Debdeep Pati
Texas A&M University (TAMU)
Abstract: In this talk, we aim to understand the statistical properties of variational inference, an approximate method for posterior computation in Bayesian models. We propose a family of variational approximations to Bayesian posterior distributions, of which the standard variational approximation is a special case. A novel class of variational inequalities is developed for linking the Bayes risk under the variational approximation to the objective function in the variational optimization problem, implying that maximizing the evidence lower bound in variational inference has the effect of minimizing the Bayes risk within the variational density family. Operating in a frequentist setup, the variational inequalities imply that point estimates constructed from the procedure converge at an optimal rate to the true parameter in a wide range of problems. We illustrate the general theory with a number of examples, including mean-field inference in high-dimensional linear regression; latent variable models such as Gaussian mixtures and latent Dirichlet allocation; minorization-based variational inference in non-conjugate models; and expressive variational families such as auto-encoders and implicit variational inference. I will conclude the talk with some open problems.
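
Background for attendees: the link between the evidence lower bound (ELBO) and a Kullback-Leibler projection is a standard identity, sketched below in generic notation (p for the model, q for a variational density, X for data, theta for the parameter); the talk's variational inequalities strengthen this connection by tying the objective to the Bayes risk.

\[
\log p(X) \;=\; \underbrace{\mathbb{E}_{q}\!\left[\log \frac{p(X,\theta)}{q(\theta)}\right]}_{\mathrm{ELBO}(q)} \;+\; \mathrm{KL}\!\bigl(q(\theta)\,\big\|\,p(\theta \mid X)\bigr).
\]

Since \(\log p(X)\) does not depend on \(q\), maximizing \(\mathrm{ELBO}(q)\) over the variational family is equivalent to minimizing the KL divergence to the exact posterior within that family.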
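As a concrete instance of the mean-field regression example mentioned above, here is a minimal, self-contained sketch of coordinate-ascent variational inference (CAVI) for Bayesian linear regression with a N(0, tau2*I) prior on the coefficients and known noise variance; it illustrates the general style of mean-field procedure, not the specific algorithm from the talk, and all names (cavi_linear_regression, sigma2, tau2) are illustrative.

import numpy as np

def cavi_linear_regression(X, y, sigma2=1.0, tau2=1.0, n_iters=100, tol=1e-8):
    """Mean-field CAVI for y = X beta + noise, noise ~ N(0, sigma2*I),
    prior beta ~ N(0, tau2*I). Approximates the posterior by a product
    of independent Gaussians q(beta_j) = N(mu_j, s2_j)."""
    n, p = X.shape
    mu = np.zeros(p)
    # In this conjugate model the coordinate-wise variances are fixed.
    s2 = 1.0 / (np.sum(X**2, axis=0) / sigma2 + 1.0 / tau2)
    residual = y - X @ mu  # running residual y - X mu
    for _ in range(n_iters):
        mu_old = mu.copy()
        for j in range(p):
            residual += X[:, j] * mu[j]        # remove coordinate j's contribution
            mu[j] = s2[j] * (X[:, j] @ residual) / sigma2
            residual -= X[:, j] * mu[j]        # restore residual with updated mu_j
        if np.max(np.abs(mu - mu_old)) < tol:  # stop once updates stabilize
            break
    return mu, s2

# Toy usage: recover a sparse coefficient vector from noisy observations.
rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.0, 0.5]
y = X @ beta_true + 0.5 * rng.standard_normal(n)
mu, s2 = cavi_linear_regression(X, y, sigma2=0.25, tau2=10.0)

Because the model is conjugate, the exact posterior is Gaussian here, which makes this a convenient test case: the mean-field means mu track the exact posterior mean, while the factorized variances understate posterior correlations, one of the behaviors the talk's theory addresses.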