
Presenter: Sifan Liu, Assistant Professor, Department of Statistical Science, Duke University
Description: Mean-field variational inference (MFVI) approximates a target distribution with a product distribution in the standard coordinate system, offering a scalable approach to Bayesian inference but often severely underestimating uncertainty due to neglected dependence. We show that MFVI can be greatly improved when performed along carefully chosen principal component axes rather than the standard coordinates. The principal components are obtained from a cross-covariance matrix of the target’s score function and identify orthogonal directions that capture the dominant discrepancies between the target distribution and a Gaussian reference. Performing MFVI in this rotated system thus defines a transformation, a rotation followed by a coordinatewise map, that moves the target closer to a Gaussian. Iterating this procedure yields a sequence of transformations that progressively Gaussianize the target. The resulting algorithm provides a computationally efficient construction of normalizing flows, requiring only MFVI sub-problems and avoiding large-scale optimization. In posterior sampling tasks, we demonstrate that the proposed method greatly outperforms standard MFVI while achieving accuracy comparable to normalizing flows at a much lower computational cost.
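For readers who want a concrete picture of the "rotate, then transform coordinatewise" idea, the sketch below is a simplified Python caricature, not the speaker's algorithm: the exact cross-covariance construction, the score handling across iterations, and the per-axis standardization used in place of the one-dimensional MFVI sub-problems are all illustrative assumptions.

```python
# Illustrative sketch only: one "rotate, then transform coordinatewise" step.
# The cross-covariance form and the coordinatewise step are placeholders,
# not the MFVI sub-problems described in the talk.
import numpy as np

def gaussianize_step(samples, score_fn):
    """One rotation-plus-coordinatewise-transform step.

    samples  : (n, d) array of draws from the current distribution
    score_fn : callable returning the target's score (grad log density) at each row
    """
    x = samples - samples.mean(axis=0)
    s = score_fn(samples)
    # Assumed form: cross-covariance between samples and their scores,
    # symmetrized so that its eigenvectors give an orthogonal rotation.
    c = x.T @ s / len(x)
    c = 0.5 * (c + c.T)
    _, rotation = np.linalg.eigh(c)      # columns are the principal axes
    z = x @ rotation                      # express samples in the rotated coordinates
    # Stand-in for the coordinatewise (mean-field) sub-problem: per-axis standardization.
    z = (z - z.mean(axis=0)) / z.std(axis=0)
    return z, rotation

# Hypothetical usage with a toy target whose score is -x (standard Gaussian).
# Iterating further would require pushing the score forward through the learned
# transformation, which is omitted here for brevity.
rng = np.random.default_rng(0)
toy_score = lambda x: -x
draws = rng.standard_normal((1000, 5)) * 3.0 + 1.0
transformed, axes = gaussianize_step(draws, toy_score)
```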
Bio: Sifan Liu is an Assistant Professor in the Department of Statistical Science at Duke University. She was previously a research scientist at the Flatiron Institute and received her Ph.D. in Statistics from Stanford University. Her research interests include sampling, generative modeling, and selective inference.
Hosted by: Statistics Department