Benign overfitting and adaptive nonparametric regression

28.11.2022, 14:00  –  Zoom
Forschungsseminar Statistik

Julien Chhor

Abstract:
Benign overfitting is a counter-intuitive phenomenon recently discovered in the deep-learning community. In specific cases, it has been experimentally observed that deep neural networks can perfectly overfit a noisy training data set while still achieving excellent generalization performance on new data points. This runs counter to the conventional statistical view that a tradeoff between bias and variance is unavoidable. This talk aims to understand benign overfitting in the simplified setting of nonparametric regression. We propose using local polynomials to construct an estimator of the regression function with the following two properties. First, this estimator is minimax-optimal over Hölder classes. Second, it is a continuous function that interpolates the set of observations with high probability. We then propose a further overfitting estimator that attains optimality adaptively with respect to the unknown Hölder smoothness. Our results highlight that in the nonparametric regression model, interpolation can be fundamentally decoupled from the bias-variance tradeoff.
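
For background, the sketch below shows a classical local polynomial (local linear) regression estimator, the general technique referenced in the abstract. It is an illustrative textbook version only, not the interpolating or adaptive construction presented in the talk; the function name, Epanechnikov kernel, and bandwidth value are assumptions chosen for the example.

```python
import numpy as np

def local_polynomial_estimate(x_train, y_train, x_query, degree=1, bandwidth=0.1):
    """Generic local polynomial regression with an Epanechnikov kernel.

    Illustrative sketch only; not the interpolating estimator from the talk.
    """
    x_query = np.asarray(x_query, dtype=float)
    estimates = np.empty_like(x_query)
    for j, x0 in enumerate(x_query):
        u = (x_train - x0) / bandwidth
        w = np.clip(1.0 - u**2, 0.0, None)           # Epanechnikov weights, zero outside the window
        if w.sum() == 0:                             # no points in the window: fall back to nearest neighbour
            estimates[j] = y_train[np.argmin(np.abs(x_train - x0))]
            continue
        # Design matrix in centred coordinates: columns (x_i - x0)^k, k = 0..degree
        X = np.vander(x_train - x0, N=degree + 1, increasing=True)
        sw = np.sqrt(w)
        # Weighted least squares fit of the local polynomial
        beta, *_ = np.linalg.lstsq(sw[:, None] * X, sw * y_train, rcond=None)
        estimates[j] = beta[0]                       # intercept = fitted value at x0
    return estimates

# Toy usage on noisy data over [0, 1]
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 200))
y = np.sin(2 * np.pi * x) + 0.3 * rng.normal(size=200)
grid = np.linspace(0, 1, 50)
fit = local_polynomial_estimate(x, y, grid, degree=1, bandwidth=0.05)
```

For a small bandwidth the fit follows the data closely (and can interpolate it), while a larger bandwidth smooths more aggressively; the talk's point is that, with a suitable construction, interpolation need not come at the cost of minimax optimality.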


Zoom link on request
