
Characterizations of Uniform Learnability: Vapnik–Chervonenkis, Natarajan, and Daniely–Shalev Shwartz Dimensions
September 12, 2025 @ 11:00 am - 12:00 pm
Shay Moran (Technion and Google Research)
E18-304
Abstract:
One of the central goals in statistical learning theory is to understand which hypothesis classes can be learned from data. A landmark result in this direction is the Fundamental Theorem of PAC Learning, which characterizes the learnability of binary classification problems via the VC dimension, a combinatorial parameter introduced by Vapnik and Chervonenkis in the 1960s.
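To make the VC dimension concrete, here is a minimal brute-force sketch (not part of the talk): for a finite binary class over a finite domain, the VC dimension is the size of the largest set on which the class realizes all possible 0/1 labelings. The function name and the threshold-class example below are illustrative, not taken from the talk.

```python
from itertools import combinations

def vc_dimension(domain, hypotheses):
    """Brute-force VC dimension of a finite binary hypothesis class.

    hypotheses: list of dicts mapping each point of domain to 0 or 1.
    A set S is shattered if the restrictions of the class to S
    realize all 2^|S| binary labelings of S.
    """
    d = 0
    for k in range(1, len(domain) + 1):
        for S in combinations(domain, k):
            # Distinct labeling patterns the class induces on S.
            patterns = {tuple(h[x] for x in S) for h in hypotheses}
            if len(patterns) == 2 ** k:
                d = k  # some set of size k is shattered
                break
    return d

# Threshold classifiers on {1,...,5}: h_t(x) = 1 iff x >= t.
# They shatter singletons but no pair, so the VC dimension is 1.
domain = [1, 2, 3, 4, 5]
thresholds = [{x: int(x >= t) for x in domain} for t in range(1, 7)]
print(vc_dimension(domain, thresholds))  # → 1
```

This exponential search is only feasible for toy classes, but it matches the definition directly: find the largest shattered set.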
For multiclass learning — where labels take more than two values — a parallel characterization has been an open question since the 1980s. In this talk I will present our resolution: multiclass PAC learnability is exactly characterized by the DS dimension, introduced by Daniely and Shalev-Shwartz (2014).
I will also discuss the long-standing role of the Natarajan dimension (1988), which has been a key candidate for this characterization. We show that it does not characterize multiclass learnability by constructing a non-learnable class with Natarajan dimension one. The construction highlights an intriguing connection between learning theory and topology, and builds on a deep result in geometric group theory due to Januszkiewicz and Świątkowski.
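For comparison with the VC case, here is a hedged brute-force sketch of the Natarajan dimension for a finite multiclass class (again illustrative, not from the talk): a set S is N-shattered if there are two witness labelings f0, f1 that disagree at every point of S, such that every coordinate-wise mixture of f0 and f1 is realized by some hypothesis.

```python
from itertools import combinations, product

def natarajan_dimension(domain, hypotheses):
    """Brute-force Natarajan dimension of a finite multiclass class.

    hypotheses: list of dicts mapping each point of domain to a label.
    S is N-shattered if there exist witnesses f0, f1 with f0(x) != f1(x)
    for all x in S, such that every function agreeing with f0 on some
    subset of S and with f1 elsewhere is realized by the class.
    """
    def n_shattered(S):
        # Restrictions of the class to S; valid witnesses must appear
        # here themselves (take the subsets B = S and B = empty set).
        projections = {tuple(h[x] for x in S) for h in hypotheses}
        for f0 in projections:
            for f1 in projections:
                if any(a == b for a, b in zip(f0, f1)):
                    continue  # witnesses must disagree everywhere on S
                # All 2^|S| coordinate-wise mixtures of f0 and f1.
                needed = set(product(*zip(f0, f1)))
                if needed <= projections:
                    return True
        return False

    d = 0
    for k in range(1, len(domain) + 1):
        if any(n_shattered(S) for S in combinations(domain, k)):
            d = k
    return d

# Three constant functions over labels {0, 1, 2}: Natarajan dimension 1,
# even though three labels are in play.
domain = [1, 2]
constants = [{x: c for x in domain} for c in range(3)]
print(natarajan_dimension(domain, constants))  # → 1
```

For binary labels this coincides with the VC dimension; the talk's point is that for general label sets a class can have Natarajan dimension one and still fail to be learnable.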
Finally, I will present a conjecture suggesting that the Natarajan dimension nonetheless plays a central role in multiclass PAC learning: while it does not capture learnability itself, it may characterize the optimal agnostic learning rate in this setting.
Overall, these results illustrate how fundamental questions in learning theory can lead to precise characterizations and to new links with other areas of mathematics.
Bio:
Shay Moran is an Associate Professor at the Technion, with joint appointments in the Departments of Mathematics, Computer Science, and Data and Decision Sciences. His research lies at the intersection of machine learning, mathematics, and theoretical computer science, with a focus on generalization, differential privacy, and the foundations of learning theory. Prof. Moran has resolved several long-standing open problems in learning theory and related areas, and his work has revealed deep connections between machine learning and fields such as combinatorics, topology, set theory, and game theory. His papers have appeared in top venues including COLT, NeurIPS, ICML, STOC, FOCS, JACM, and Nature Machine Intelligence, and have received multiple awards, including Best Paper Awards at FOCS 2020 and COLT 2020. He is also the recipient of an ERC Starting Grant and other competitive fellowships.