Minimax Adaptive Boosting for Online Nonparametric Regression
Preprint, working paper, 2024

Abstract

We study boosting for adversarial online nonparametric regression with general convex losses. We first introduce a parameter-free online gradient boosting (OGB) algorithm and show that its application to chaining trees achieves minimax optimal regret when competing against Lipschitz functions. While competing with nonparametric function classes can be challenging, the latter often exhibit local patterns, such as local Lipschitzness, that online algorithms can exploit to improve performance. By applying OGB over a core tree based on chaining trees, our proposed method effectively competes against all prunings that align with different Lipschitz profiles and demonstrates optimal dependence on the local regularities. As a result, we obtain the first computationally efficient algorithm with locally adaptive optimal rates for online regression in an adversarial setting.
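The abstract describes online gradient boosting (OGB): a pool of weak online learners is combined stagewise, and each learner is trained on the gradient of the loss evaluated at the running partial prediction. As a rough illustration only, and not the paper's parameter-free algorithm or its chaining-tree weak learners, the sketch below implements a generic OGB loop for square loss. The linear weak learners, the fixed step sizes eta and lr, and the square loss are simplifying assumptions introduced here for readability.

import numpy as np

class OnlineLinearLearner:
    """Toy weak online learner: online gradient descent on a linear model.
    A hypothetical stand-in for the paper's chaining-tree weak learners."""
    def __init__(self, dim, lr=0.1):
        self.w = np.zeros(dim)
        self.lr = lr

    def predict(self, x):
        return float(self.w @ x)

    def update(self, x, grad):
        # OGD step on the linear surrogate loss grad * (w @ x).
        self.w -= self.lr * grad * x

def online_gradient_boosting(stream, n_learners=10, eta=0.3, dim=5):
    """Minimal online gradient boosting loop for square loss.
    Each weak learner receives the loss gradient at the partial
    prediction accumulated before its own contribution."""
    learners = [OnlineLinearLearner(dim) for _ in range(n_learners)]
    losses = []
    for x, y in stream:
        # Stagewise prediction: each learner corrects the running sum.
        partial, partials = 0.0, []
        for f in learners:
            partials.append(partial)
            partial += eta * f.predict(x)
        losses.append(0.5 * (partial - y) ** 2)
        # Gradient of the square loss at learner i's partial prediction.
        for f, p in zip(learners, partials):
            f.update(x, p - y)
    return learners, losses

# Usage on a synthetic linear stream:
rng = np.random.default_rng(0)
w_true = rng.standard_normal(5)
stream = ((x, float(w_true @ x)) for x in rng.standard_normal((1000, 5)))
learners, losses = online_gradient_boosting(stream)
print(f"mean loss over last 100 rounds: {np.mean(losses[-100:]):.4f}")

In the paper, the weak learners are chaining trees and the algorithm is parameter-free; the fixed eta and lr above would be tuned adaptively in that setting.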

Dates and versions

hal-04719933, version 1 (03-10-2024)

Cite

Paul Liautaud, Pierre Gaillard, Olivier Wintenberger. Minimax Adaptive Boosting for Online Nonparametric Regression. 2024. ⟨hal-04719933⟩