Stat Colloquium: Dr. Ray Bai
George Mason University
Friday, February 20, 2026 · 11 AM - 12 PM
Title: Deep Generative Models for Statistical Problems: Methods, Computation, and Theory
Abstract: Generative AI powered by deep generative models (DGMs) has recently achieved remarkable empirical success across a number of domains. At its core, a DGM learns a deterministic mapping from a source distribution to a target probability distribution. In this talk, we discuss two projects related to DGMs. In the first project, we introduce a new generative model for quantile regression. Our method simultaneously generates samples at many random quantile levels of a response Y given covariates X, allowing us to infer the full conditional density of Y given X. To counteract the problem of training-data memorization, we introduce a novel variability penalty on the generator. We further introduce a new family of partial monotonic neural networks (PMNN) to circumvent the problem of crossing quantile curves.
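The abstract does not spell out the PMNN architecture, so the following PyTorch sketch only illustrates the general idea under one common construction for partially monotone networks: keep the weights acting on the quantile-level input non-negative (via softplus) and use increasing activations, so the output is non-decreasing in the quantile level tau for every X, which rules out crossing quantile curves. The class name, layer sizes, and weight construction here are assumptions for illustration, not the speaker's design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class PartialMonotonicNet(nn.Module):
    """Illustrative sketch (not the talk's PMNN): a network g(x, tau) that is
    non-decreasing in the quantile level tau but unrestricted in the
    covariates x. Monotonicity in tau holds because tau only enters through
    non-negative weights (softplus-transformed) composed with increasing
    activations (tanh) and a non-negative output layer."""

    def __init__(self, x_dim: int, hidden: int = 64):
        super().__init__()
        self.x_net = nn.Linear(x_dim, hidden)                    # unconstrained path for x
        self.tau_weight1 = nn.Parameter(torch.randn(hidden, 1))  # constrained path for tau
        self.tau_weight2 = nn.Parameter(torch.randn(1, hidden))  # constrained output weights
        self.bias = nn.Parameter(torch.zeros(1))

    def forward(self, x: torch.Tensor, tau: torch.Tensor) -> torch.Tensor:
        # Hidden layer: free in x; softplus keeps the tau-weights non-negative,
        # so the pre-activation (and tanh of it) is non-decreasing in tau.
        h = torch.tanh(self.x_net(x) + tau @ F.softplus(self.tau_weight1).T)
        # Non-negative output weights preserve monotonicity in tau.
        return h @ F.softplus(self.tau_weight2).T + self.bias


if __name__ == "__main__":
    net = PartialMonotonicNet(x_dim=5)
    x = torch.randn(8, 5)    # 8 observations, 5 covariates
    tau = torch.rand(8, 1)   # random quantile levels in (0, 1)
    q_hat = net(x, tau)      # estimated tau-th conditional quantiles, shape (8, 1)
```

Because the output is monotone in tau for each fixed x, sampling many random levels tau traces out a valid (non-crossing) family of conditional quantile curves, and hence the full conditional distribution of Y given X.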
Despite their appeal, DGMs often struggle to learn heavy-tailed distributions. In the second project, we analyze the fundamental nature of this limitation through the lens of functional inequalities. We show that, under the widely used Gaussian source distribution, the transport map must have increasingly large norm as the moments of the target distribution grow. We then extend these results to general log-concave source distributions. Notably, our results are intrinsic to the underlying transport problem and are independent of specific architectures, parameterizations, or training schemes. This has several practical implications for the design and analysis of DGMs.
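For intuition on why a Gaussian source is at odds with heavy tails, here is a standard one-dimensional calculation based on Gaussian concentration; this is an illustrative argument consistent with the abstract, not the talk's functional-inequality proof.

```latex
% Monotone transport from a standard Gaussian source to a target with cdf F:
%   T(z) = F^{-1}(\Phi(z)), so Z ~ N(0,1) implies T(Z) ~ F.
% If T were L-Lipschitz, Gaussian concentration would force sub-Gaussian tails:
\[
  \Pr\bigl( \lvert T(Z) - \mathbb{E}\,T(Z) \rvert > t \bigr)
    \;\le\; 2 \exp\!\left( -\frac{t^{2}}{2L^{2}} \right),
\]
% which no heavy-tailed target, e.g. a Pareto law with
% \Pr(Y > t) \asymp t^{-\alpha}, can satisfy for any finite L.
% Hence the norm of the transport map must grow without bound as
% heavier tails (larger moments) are demanded of the target.
```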