
Inverse statistical learning

Abstract:

Let $(X, Y) \in \mathcal{X} \times \mathcal{Y}$ be a random couple with unknown distribution $P$. Let $\mathcal{G}$ be a class of measurable functions and $\ell$ a loss function. The problem of statistical learning deals with the estimation of the Bayes rule:

$$g^* = \arg\min_{g \in \mathcal{G}} \mathbb{E}_P\, \ell\bigl(g, (X, Y)\bigr).$$


In this paper, we study this problem for a contaminated sample $(Z_1, Y_1), \ldots, (Z_n, Y_n)$ of i.i.d. indirect observations. Each input $Z_i$, $i = 1, \ldots, n$, is drawn from a density $Af$, where $A$ is a known compact linear operator and $f$ is the density of the direct input $X$.

We derive fast rates of convergence for the excess risk of empirical risk minimizers based on regularization methods, such as deconvolution kernel density estimators or spectral cut-off. These rates are comparable to the existing fast rates of Koltchinskii (2006) for the direct case, and they give some insight into the effect of indirect measurements in the presence of fast rates of convergence.
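To make the deconvolution kernel density estimator mentioned in the abstract concrete, here is a minimal numerical sketch. It is not the paper's construction (the paper plugs such estimators into the empirical risk before minimizing); it only shows, under the assumption of additive Gaussian noise with known level `sigma` and a kernel with compactly supported Fourier transform, how an estimate of the direct-input density $f$ can be recovered from the indirect observations $Z_i$. The function name `deconvolution_kde`, the kernel choice, and all parameter values are illustrative.

```python
import numpy as np

def deconvolution_kde(z, grid, h, sigma):
    """Deconvolution kernel density estimate of the direct-input density f,
    evaluated on `grid`, from indirect observations z_i = x_i + eps_i with
    Gaussian noise eps_i ~ N(0, sigma^2) of known level sigma.

    The kernel is defined through its Fourier transform
    phi_K(t) = (1 - t^2)^3 on [-1, 1], so the division by the Gaussian noise
    characteristic function stays well defined (illustrative choice)."""
    s = np.linspace(-1.0 / h, 1.0 / h, 2001)       # frequencies where phi_K(s*h) != 0
    phi_K = (1.0 - (s * h) ** 2) ** 3              # kernel Fourier transform at s*h
    phi_eps = np.exp(-0.5 * (sigma * s) ** 2)      # Gaussian noise characteristic function
    emp_cf = np.exp(1j * s[:, None] * z[None, :]).mean(axis=1)  # empirical char. function of Z
    integrand = emp_cf * phi_K / phi_eps           # deconvolved spectrum of f
    ds = s[1] - s[0]
    # Inverse Fourier transform: f_hat(x) = (1 / 2*pi) * int exp(-i s x) integrand(s) ds
    f_hat = np.real(np.exp(-1j * np.outer(grid, s)) @ integrand) * ds / (2.0 * np.pi)
    return f_hat

# Toy usage (hypothetical setup): X ~ N(0, 1) contaminated by Gaussian noise, sigma = 0.3.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=2000)
z = x + rng.normal(0.0, 0.3, size=2000)
grid = np.linspace(-4.0, 4.0, 200)
f_hat = deconvolution_kde(z, grid, h=0.4, sigma=0.3)
```

The bandwidth `h` plays the role of the regularization parameter: smaller values recover finer detail of $f$ but amplify the division by the noise characteristic function, which is the usual bias-variance trade-off in this inverse problem.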

Document type: Journal article

https://hal.univ-angers.fr/hal-03038376
Contributor: Okina Université d'Angers
Submitted on: Thursday, December 3, 2020 - 14:33:13
Last modified on: Friday, December 4, 2020 - 03:22:51

Citation

Sébastien Loustau. Inverse statistical learning. Electronic Journal of Statistics, 2013, 7, pp.2065-2097. ⟨10.1214/13-EJS838⟩. ⟨hal-03038376⟩
