Diffusion models and minimax rates: scores, functionals, and tests
February 6, 2026 @ 11:00 am - 12:00 pm
Subhodh Kotekal (MIT)
E18-304
Abstract: While score-based diffusion models have achieved remarkable success in high-dimensional generative modeling, some basic theoretical questions have not been precisely resolved. In this talk, we address minimax optimality of density estimation, functional estimation, and hypothesis testing. First, we show diffusion models achieve the optimal density estimation rate over Hölder balls. This result is a consequence of our sharp characterization of minimax score estimation across all noising levels. A key contribution is our lower bound argument, which involves a slight twist on a classical construction. The squared norm of the score function is the Fisher information, which is related to important functionals such as mutual information and entropy. Leveraging well-known information-theoretic relations (such as the I-MMSE and de Bruijn identities), we furnish estimators of these functionals by studying noised Fisher information estimation; our results establish that simple plug-in estimators can achieve parametric rates. Finally, we study the applicability of diffusion models to hypothesis testing. By aggregating information across noise scales, we demonstrate diffusion models can achieve the sharp minimax separation rate, despite the natural intuition that noising should make discrimination harder.
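For readers unfamiliar with the identities mentioned in the abstract, the following is a sketch of their standard forms (notation is ours, not necessarily the speaker's). For the Gaussian noising channel $Y_t = X + \sqrt{t}\,Z$ with $Z \sim \mathcal{N}(0, I_d)$ independent of $X$, de Bruijn's identity links differential entropy to Fisher information, and the I-MMSE identity links mutual information to minimum mean-squared error:

```latex
% de Bruijn's identity: the noised density p_t has score \nabla \log p_t,
% whose squared norm in expectation is the Fisher information J(Y_t).
\frac{d}{dt}\, h(Y_t) \;=\; \tfrac{1}{2}\, J(Y_t),
\qquad
J(Y_t) \;=\; \mathbb{E}\,\bigl\| \nabla \log p_t(Y_t) \bigr\|^2 .

% I-MMSE identity (Guo-Shamai-Verdu), in the signal-to-noise
% parametrization Y_\gamma = \sqrt{\gamma}\, X + Z:
\frac{d}{d\gamma}\, I\bigl(X;\, \sqrt{\gamma}\, X + Z\bigr)
\;=\; \tfrac{1}{2}\, \mathrm{mmse}(\gamma),
\qquad
\mathrm{mmse}(\gamma) \;=\; \mathbb{E}\,\bigl\| X - \mathbb{E}[X \mid Y_\gamma] \bigr\|^2 .
```

These relations explain the route described in the abstract: an estimator of the Fisher information of the noised distribution, integrated across noise levels, yields estimators of entropy and mutual information.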
Bio: Subhodh Kotekal is a Norbert Wiener Fellow in the Statistics and Data Science Center at MIT. He obtained his Ph.D. from the Department of Statistics at the University of Chicago in June 2025, where he was advised by Chao Gao.