TITLE: Confidence Intervals for High-Dimensional Linear Regression: Minimax Rates and Adaptivity
ABSTRACT: Confidence sets play a fundamental role in statistical inference. In this paper, we consider confidence intervals for high-dimensional linear regression with random design. We first establish the convergence rates of the minimax expected length for confidence intervals in the oracle setting where the sparsity parameter is given. The focus is then on the problem of adaptation to sparsity in the construction of confidence intervals. Ideally, an adaptive confidence interval should have its length automatically adjusted to the sparsity of the unknown regression vector, while maintaining a prespecified coverage probability. It is shown that such a goal is in general not attainable, except when the sparsity parameter is restricted to a small region over which the confidence intervals have the optimal length of the usual parametric rate. It is further demonstrated that the lack of adaptivity is not due to the conservativeness of the minimax framework, but is fundamentally caused by the difficulty of learning the bias accurately.
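To make the object of study concrete, here is a minimal sketch of one standard way to build a confidence interval for a single coefficient in high-dimensional linear regression: the debiased (de-sparsified) Lasso, in which a Lasso estimate is corrected for its bias using a score vector from a nodewise Lasso regression. This is an illustration of the general construction, not the procedure analyzed in the paper; the tuning parameters `lam` and `lam_node` and the crude residual-based noise estimate are simplifying assumptions.

```python
import numpy as np

def soft_threshold(x, t):
    # Soft-thresholding operator, the proximal map of the l1 penalty.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=300):
    """Coordinate descent for (1/2n)||y - Xb||_2^2 + lam * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_norm = (X ** 2).sum(axis=0) / n
    r = y.copy()                            # running residual y - X b
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * b[j]             # remove coordinate j's contribution
            b[j] = soft_threshold(X[:, j] @ r / n, lam) / col_norm[j]
            r -= X[:, j] * b[j]             # add the updated contribution back
    return b

def debiased_lasso_ci(X, y, j, lam, lam_node, z_alpha=1.96):
    """Approximate 95% confidence interval for beta_j via the debiased Lasso."""
    n, p = X.shape
    b = lasso_cd(X, y, lam)
    # Nodewise Lasso: regress column j on the remaining columns to get a score z.
    others = [k for k in range(p) if k != j]
    gamma = lasso_cd(X[:, others], X[:, j], lam_node)
    z = X[:, j] - X[:, others] @ gamma
    resid = y - X @ b
    # Bias correction of the Lasso coordinate estimate.
    b_debiased = b[j] + z @ resid / (z @ X[:, j])
    sigma_hat = np.sqrt(resid @ resid / n)  # crude noise-level estimate (assumption)
    se = sigma_hat * np.sqrt(z @ z) / abs(z @ X[:, j])
    return b_debiased - z_alpha * se, b_debiased + z_alpha * se
```

Note that the half-width of this interval scales like n^{-1/2} when the debiasing succeeds; the adaptivity question in the abstract is whether the length can instead shrink with the true sparsity while coverage is maintained.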
BIO: Tony Cai received his Ph.D. in Mathematics from Cornell University in 1996, and is currently the Dorothy Silberberg Professor of Statistics at the Wharton School, a professor in the Applied Mathematics and Computational Science Graduate Group, and an Associate Scholar of the Department of Biostatistics and Epidemiology in the Perelman School of Medicine at the University of Pennsylvania. He is the recipient of the 2008 COPSS Presidents’ Award, and a Fellow and Medallion Lecturer of the Institute of Mathematical Statistics. He is an associate editor of the Journal of the Royal Statistical Society, Series B, and a former Editor of the Annals of Statistics. His research interests include high-dimensional statistics, functional data analysis, large-scale multiple testing, nonparametric function estimation, and statistical decision theory.