AM Seminar: Variational Inference and Density Estimation with Non-Negative Tensor Train

Presenter: Dr. Xun Tang, Stanford University
Description: This talk presents an efficient numerical approach for compressing a high-dimensional discrete distribution function into a non-negative tensor train (NTT) format. We consider two settings, variational inference and density estimation, in which one has access to either an unnormalized analytic formula for the distribution or samples drawn from it. The compression proceeds in two stages: in the first, we use existing subroutines to encode the distribution function in a tensor train format; in the second, we fit an NTT ansatz to the obtained tensor train. In the NTT fitting procedure, we use a log-barrier term to ensure the positivity of each tensor component, and a second-order alternating minimization scheme to accelerate convergence. In practice, the proposed fitting procedure converges drastically faster than a previously proposed multiplicative update method. Through challenging numerical experiments, we show that our approach can accurately compress target distribution functions.
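The log-barrier plus second-order alternating minimization idea mentioned in the abstract can be illustrated on a much simpler problem than NTT fitting. The sketch below is a toy analogue, not the speaker's method: it fits a non-negative rank-1 factorization T ≈ a bᵀ by alternating per-coordinate Newton steps on a least-squares objective augmented with a log-barrier term −μ(Σ log aᵢ + Σ log bⱼ) that keeps both factors strictly positive. The function name, the rank-1 setting, and all parameter values are illustrative choices.

```python
import numpy as np

def fit_rank1_logbarrier(T, mu=1e-3, iters=200):
    """Toy analogue of log-barrier alternating minimization:
    fit T ≈ outer(a, b) with a, b > 0 by minimizing
    f(a, b) = ||T - a b^T||_F^2 - mu * (sum log a + sum log b)
    with alternating Newton steps (the objective is separable per
    coordinate of the factor being updated, so the Hessian is diagonal)."""
    m, n = T.shape
    rng = np.random.default_rng(0)
    a = rng.uniform(0.5, 1.5, m)
    b = rng.uniform(0.5, 1.5, n)
    for _ in range(iters):
        # Newton step in a, with b held fixed.
        g = -2.0 * (T - np.outer(a, b)) @ b - mu / a       # gradient
        h = 2.0 * (b @ b) + mu / a**2                       # diagonal Hessian
        a = np.maximum(a - g / h, 1e-12)  # barrier keeps iterates positive; clip as a safeguard
        # Newton step in b, with a held fixed.
        g = -2.0 * (T - np.outer(a, b)).T @ a - mu / b
        h = 2.0 * (a @ a) + mu / b**2
        b = np.maximum(b - g / h, 1e-12)
    return a, b
```

Because the barrier's Hessian μ/x² blows up near zero, the Newton steps naturally shrink as a coordinate approaches the boundary, which is what allows a second-order scheme to enforce positivity without explicit projection; this is the mechanism the abstract's NTT fitting procedure exploits, applied there to each tensor-train core rather than to vectors.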
Bio: Xun Tang is a postdoc in the Department of Mathematics at Stanford University, hosted by Prof. Lexing Ying. Xun works on tensor network methods for scientific computing and data science, as well as on optimal transport algorithms. Xun will join the Department of Mathematics at HKUST as an assistant professor in August 2026.
Hosted by: Applied Mathematics Department