Finite-Particle Convergence Rates for Stein Variational Gradient Descent
March 7, 2025 @ 11:00 am - 12:00 pm
Krishna Balasubramanian (University of California - Davis)
E18-304
Abstract:
Stein Variational Gradient Descent (SVGD) is a deterministic, interacting particle-based algorithm for nonparametric variational inference, yet its theoretical properties remain challenging to fully understand. This talk presents two complementary perspectives on SVGD. First, we introduce Gaussian-SVGD, a framework that projects SVGD onto the family of Gaussian distributions using a bilinear kernel. We establish rigorous convergence results for both mean-field dynamics and finite-particle systems, proving linear convergence to equilibrium in strongly log-concave settings. This framework also unifies recent algorithms for Gaussian Variational Inference (GVI) under a single theoretical lens. Second, we examine the finite-particle convergence rates of nonparametric SVGD in Kernelized Stein Discrepancy (KSD) and Wasserstein-2 metrics. By decomposing the time derivative of relative entropy, we derive near-optimal convergence rates with polynomial dependence on dimensionality for certain kernel families. We also outline a framework to compare deterministic SVGD algorithms to the more standard randomized MCMC algorithms.
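To make the object of study concrete, here is a minimal NumPy sketch of the standard SVGD particle update (the generic algorithm, not the specific variants analyzed in the talk). It uses an RBF kernel and a standard-Gaussian target; the function name, bandwidth choice, and step size are illustrative assumptions.

```python
import numpy as np

def svgd_update(particles, grad_log_p, bandwidth=1.0):
    """One SVGD step: phi(x_i) = (1/n) sum_j [k(x_j, x_i) grad log p(x_j)
    + grad_{x_j} k(x_j, x_i)], with an RBF kernel k (bandwidth is illustrative)."""
    n, _ = particles.shape
    diffs = particles[:, None, :] - particles[None, :, :]        # (n, n, d): x_i - x_j
    K = np.exp(-np.sum(diffs**2, axis=-1) / (2 * bandwidth**2))  # RBF Gram matrix
    attractive = K @ grad_log_p(particles)                       # drives particles to high density
    repulsive = np.sum(K[:, :, None] * diffs, axis=1) / bandwidth**2  # keeps particles spread out
    return (attractive + repulsive) / n

# Toy demo: deterministic interacting particles targeting N(0, I) in 2D,
# for which grad log p(x) = -x.
rng = np.random.default_rng(0)
particles = rng.normal(loc=5.0, scale=0.5, size=(50, 2))  # initialized far from the target
grad_log_p = lambda x: -x
for _ in range(500):
    particles += 0.1 * svgd_update(particles, grad_log_p)
```

After the loop, the particle cloud should approximate the target: the empirical mean moves near the origin and the repulsive term prevents the particles from collapsing to the mode, which is the mean-field behavior whose finite-particle rates the talk quantifies.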
Bio:
Dr. Krishnakumar Balasubramanian works at the interface of statistics, optimization, and machine learning, with a focus on addressing inferential and computational aspects. He received his Ph.D. from Georgia Tech. After appointments at Princeton University and UW-Madison, he joined the Department of Statistics at UC Davis in Fall 2018. He has received a Facebook fellowship award and an ICML best paper runner-up award.