Stochastics and Statistics Seminar Series

# Optimal testing for calibration of predictive models

## March 4, 2022 @ 11:00 am - 12:00 pm

Edgar Dobriban (University of Pennsylvania)

E18-304

We find that detecting mis-calibration is only possible when the conditional probabilities of the classes are sufficiently smooth functions of the predictions. When the conditional class probabilities are Hölder continuous, we propose a minimax optimal test for calibration based on a debiased plug-in estimator of the $\ell_2$-Expected Calibration Error (ECE). We further propose a version that is adaptive to unknown smoothness. We verify our theoretical findings with a broad range of experiments, including several popular deep neural network architectures and several standard post-hoc calibration methods. Our algorithm is a general-purpose tool, which—combined with classical tests for calibration of discrete-valued predictors—can be used to test the calibration of virtually any classification method.
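The talk's debiased estimator and the accompanying test are more refined than this, but a minimal sketch of a plain binned plug-in estimate of the squared $\ell_2$-ECE for a binary classifier may help fix ideas. The function name `binned_ece_sq` and the equal-width binning scheme are illustrative assumptions, not the construction from the paper:

```python
import numpy as np

def binned_ece_sq(probs, labels, n_bins=10):
    """Plug-in estimate of the squared l2-ECE via equal-width binning.

    probs: predicted probability of class 1, shape (n,)
    labels: observed binary outcomes in {0, 1}, shape (n,)
    Returns sum over bins of (bin weight) * (mean label - mean prob)^2.
    """
    probs = np.asarray(probs, dtype=float)
    labels = np.asarray(labels, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    # assign each prediction to a bin; clip so prob == 1.0 lands in the last bin
    idx = np.clip(np.digitize(probs, edges[1:-1]), 0, n_bins - 1)
    n = len(probs)
    ece_sq = 0.0
    for b in range(n_bins):
        mask = idx == b
        if not mask.any():
            continue
        # calibration gap in this bin: empirical accuracy minus mean confidence
        gap = labels[mask].mean() - probs[mask].mean()
        ece_sq += (mask.sum() / n) * gap ** 2
    return ece_sq

# A well-calibrated toy sample gives a gap of zero in its bin:
calibrated = binned_ece_sq([0.2] * 5, [1, 0, 0, 0, 0])
# A badly calibrated one (confident 0.9, always wrong) gives 0.9^2:
miscalibrated = binned_ece_sq([0.9] * 4, [0, 0, 0, 0])
```

The naive plug-in estimate above is biased upward in small bins (the squared gap of a noisy mean is inflated), which is exactly the issue a debiased estimator is designed to correct.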