Abstract: Shape constraints such as monotonicity, convexity, and log-concavity are naturally motivated in many applications, and can offer attractive alternatives to more traditional smoothness constraints in nonparametric estimation. In this talk we present some recent results on shape constrained estimation in high and low dimensions. First, we show how shape constrained additive models can be used to select variables in a sparse convex regression function. In contrast, additive models generally fail for variable selection under smoothness constraints. Next, we introduce graph-structured extensions of isotonic regression, based on the notion of a flow on rooted trees. Finally, we describe how the least squares estimator for unimodal sequences is adaptive, building on recent adaptivity results for other shape constrained problems. We also discuss some open problems in shape constrained estimation and inference. Joint work with Min Xu (Wharton School) and Sabyasachi Chatterjee (University of Chicago).
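As a point of reference for the isotonic regression problem mentioned above (a sketch, not part of the talk): on a line graph, the least squares nondecreasing fit can be computed by the classical pool-adjacent-violators algorithm (PAVA), which the rooted-tree extensions generalize. A minimal implementation:

```python
def pava(y):
    """Least squares nondecreasing fit to the sequence y (pool adjacent violators).

    Maintains a stack of blocks (sum, count); whenever the last block's mean
    drops below the previous block's mean, the two blocks are merged.
    """
    blocks = []  # each entry is [sum of values, number of values]
    for v in y:
        blocks.append([v, 1])
        # Merge while adjacent block means violate monotonicity.
        while len(blocks) > 1 and blocks[-2][0] / blocks[-2][1] > blocks[-1][0] / blocks[-1][1]:
            s, c = blocks.pop()
            blocks[-1][0] += s
            blocks[-1][1] += c
    # Expand each block back to its constituent positions at the block mean.
    fit = []
    for s, c in blocks:
        fit.extend([s / c] * c)
    return fit


# The fitted sequence is nondecreasing and pools only the violating pair.
print(pava([1, 3, 2, 4]))  # [1, 2.5, 2.5, 4]
```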
Bio: John Lafferty is a Professor in the Department of Statistics and the Department of Computer Science at the University of Chicago.