Alain Célisse (SAMM, Paris 1), September 25, 2020

Analyzing the discrepancy principle for kernelized spectral filter algorithms
Friday, September 25, 2020

In this work, we investigate the construction of early stopping rules for iterative learning algorithms in the nonparametric regression context, where the optimal number of iterations is unknown. More precisely, we study the discrepancy principle, as well as modifications based on smoothed residuals, for kernelized spectral filter learning algorithms, including gradient descent; the classical rule is recalled below.
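For reference, the classical discrepancy principle can be written as a first-crossing rule. The notation below (iterate $\hat f_t$, empirical norm $\|\cdot\|_n$, noise level $\sigma$, tuning constant $\kappa$) is not taken from the abstract and is only illustrative:
\[
\tau \;=\; \min\bigl\{\, t \ge 1 \;:\; \|\, Y - \hat f_t \,\|_n \le \kappa\,\sigma \,\bigr\},
\qquad
\|u\|_n^2 \;=\; \frac{1}{n}\sum_{i=1}^{n} u_i^2,
\]
where $\hat f_t$ denotes the estimator after $t$ iterations of the spectral filter algorithm: iterate until the residual drops to the noise level, then stop.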
Our main theoretical bounds are oracle inequalities established for the empirical estimation error (fixed design) and for the prediction error (random design). From these finite-sample bounds, it follows that the classical discrepancy principle is statistically adaptive over the slow rates arising in the hard learning scenario, while the smoothed discrepancy principles are adaptive over ranges of faster rates (resp. higher smoothness parameters).
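To make the procedure concrete, here is a minimal sketch of kernel gradient descent stopped by a discrepancy rule. It is an illustration under stated assumptions, not the paper's algorithm: the function name, the assumed known noise level sigma, the constant kappa, and the step-size choice are all hypothetical.

```python
import numpy as np

def kernel_gd_discrepancy(K, y, sigma, step=None, kappa=1.0, max_iter=1000):
    """Kernel gradient descent with a discrepancy-principle stopping rule.

    K: (n, n) symmetric PSD kernel Gram matrix on the design points;
    y: (n,) response vector; sigma: assumed noise level.
    kappa and step are illustrative tuning choices, not the paper's notation.
    """
    n = len(y)
    if step is None:
        # any step size below 2 / lambda_max(K / n) keeps the iteration stable
        step = 1.0 / np.linalg.eigvalsh(K / n).max()
    alpha = np.zeros(n)                    # dual coefficients: f_t(x_i) = (K @ alpha)_i
    for t in range(1, max_iter + 1):
        residual = y - K @ alpha           # residual vector Y - f_t at the design points
        # discrepancy principle: stop at the first t where the empirical
        # residual norm ||Y - f_t||_n falls below the (scaled) noise level
        if np.sqrt(np.mean(residual ** 2)) <= kappa * sigma:
            return alpha, t
        alpha += (step / n) * residual     # RKHS gradient step on (1/2n) * ||y - K alpha||^2
    return alpha, max_iter
```

In the smoothed variants studied in the talk, the plain residual norm in the stopping test would be replaced by a norm of a suitably smoothed residual; the exact smoothing operator is part of the paper and is not reproduced here.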