On the power of Lenstra-Lenstra-Lovasz in noiseless inference
February 18, 2022 @ 11:00 am - 12:00 pm
Ilias Zadik (MIT)
E18-304
Abstract: In this talk, we will discuss a new polynomial-time algorithmic framework for inference problems, based on the celebrated Lenstra-Lenstra-Lovasz (LLL) lattice basis reduction algorithm. Perhaps surprisingly, this framework successfully bypasses multiple suggested notions of “computational hardness for inference” in various noiseless settings. Such settings include 1) sparse regression, where the Overlap Gap Property holds and low-degree methods fail, 2) phase retrieval, where Approximate Message Passing fails, and 3) Gaussian clustering, where the Sum-of-Squares (SoS) hierarchy fails. In particular, our results, much like the folklore but problem-specific application of Gaussian elimination to random XORSAT, highlight the crucial yet subtle role of noise in the onset of statistical-to-computational gaps.
This is based on joint works with David Gamarnik, Eren Kizildag, Min Jae Song, Joan Bruna and Alex Wein.
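For readers unfamiliar with the algorithmic primitive the abstract refers to, the following is a minimal, textbook-style sketch of LLL lattice basis reduction (with the standard parameter delta = 3/4), written in Python with exact rational arithmetic. It is only a generic illustration of the LLL algorithm itself, not the inference framework presented in the talk, and it recomputes the Gram-Schmidt data from scratch after every update for clarity rather than efficiency.

from fractions import Fraction

def gram_schmidt(basis):
    """Gram-Schmidt orthogonalization (without normalization).
    Returns the orthogonal vectors and the mu coefficients."""
    ortho = []
    mu = [[Fraction(0)] * len(basis) for _ in basis]
    for i, b in enumerate(basis):
        v = list(b)
        for j in range(i):
            denom = sum(x * x for x in ortho[j])
            mu[i][j] = sum(x * y for x, y in zip(b, ortho[j])) / denom
            v = [x - mu[i][j] * y for x, y in zip(v, ortho[j])]
        ortho.append(v)
    return ortho, mu

def lll_reduce(basis, delta=Fraction(3, 4)):
    """Textbook LLL reduction of an integer lattice basis."""
    basis = [[Fraction(x) for x in b] for b in basis]
    n = len(basis)
    ortho, mu = gram_schmidt(basis)
    k = 1
    while k < n:
        # Size-reduction step: make |mu[k][j]| <= 1/2 for all j < k.
        for j in range(k - 1, -1, -1):
            q = round(mu[k][j])
            if q != 0:
                basis[k] = [x - q * y for x, y in zip(basis[k], basis[j])]
                ortho, mu = gram_schmidt(basis)
        # Lovasz condition on consecutive Gram-Schmidt norms.
        norm_k = sum(x * x for x in ortho[k])
        norm_km1 = sum(x * x for x in ortho[k - 1])
        if norm_k >= (delta - mu[k][k - 1] ** 2) * norm_km1:
            k += 1
        else:
            # Swap the two vectors and step back.
            basis[k], basis[k - 1] = basis[k - 1], basis[k]
            ortho, mu = gram_schmidt(basis)
            k = max(k - 1, 1)
    return [[int(x) for x in b] for b in basis]

if __name__ == "__main__":
    # A small integer basis; the reduced basis consists of shorter,
    # nearly orthogonal vectors generating the same lattice.
    print(lll_reduce([[1, 1, 1], [-1, 0, 2], [3, 5, 6]]))

In practice one would rely on an optimized implementation such as fplll (with its Python interface fpylll) rather than this sketch; the point here is only to show the size-reduction and swap steps that make LLL run in polynomial time.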
–
Bio: Ilias Zadik is a postdoctoral researcher in the Mathematics Department at MIT, working with Professor Elchanan Mossel and Professor Nike Sun. Previously, he spent two years as a postdoctoral fellow at the Center for Data Science at NYU, and before that he received his PhD in 2019 from the Operations Research Center at MIT under the supervision of Professor David Gamarnik. His research lies broadly at the interface of high-dimensional statistics, the theory of machine learning, and applied probability. He is especially interested in understanding the statistical and computational trade-offs that appear in high-dimensional inference.