
Model selection for (auto-)regression with dependent data

Published online by Cambridge University Press:  15 August 2002

Yannick Baraud
Affiliation:
École Normale Supérieure, DMA, 45 rue d'Ulm, 75230 Paris Cedex 05, France; Yannick.Baraud@ens.fr.
F. Comte
Affiliation:
Laboratoire de Probabilités et Modèles Aléatoires, Boîte 188, Université Paris 6, 4 place Jussieu, 75252 Paris Cedex 05, France.
G. Viennet
Affiliation:
Laboratoire de Probabilités et Modèles Aléatoires, Boîte 7012, Université Paris 7, 2 place Jussieu, 75251 Paris Cedex 05, France.

Abstract

In this paper, we study the problem of nonparametric estimation of an unknown regression function from dependent data with sub-Gaussian errors. As a particular case, we handle the autoregressive framework. For this purpose, we consider a collection of finite-dimensional linear spaces (e.g. linear spaces spanned by wavelets or piecewise polynomials on a possibly irregular grid) and we estimate the regression function by a least-squares estimator built on a linear space selected from the collection in a data-driven way. This data-driven choice is performed via the minimization of a penalized criterion akin to Mallows' $C_p$. We state non-asymptotic risk bounds for our estimator in some ${\mathbb{L}}_2$-norm and we show that it is adaptive in the minimax sense over a large class of Besov balls of the form $B_{\alpha,p,\infty}(R)$ with $p \geq 1$.
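As a rough illustration of the selection rule described in the abstract (not the authors' implementation), the sketch below fits least-squares estimators on a hypothetical collection of regular histogram spaces and keeps the dimension minimising the empirical contrast plus a Mallows-$C_p$-style penalty. The choice of histogram spaces, the known noise level `sigma`, and the penalty constant 2 are illustrative assumptions, and the toy data are i.i.d. rather than the dependent/autoregressive setting treated in the paper.

```python
# Minimal sketch of penalized least-squares model selection in the spirit of
# Mallows' C_p, as described in the abstract. All names and constants below
# (histogram spaces, sigma, penalty constant 2) are illustrative assumptions.
import numpy as np

def histogram_design(x, dim):
    """Design matrix of the piecewise-constant space with `dim` cells on [0, 1]."""
    bins = np.minimum((x * dim).astype(int), dim - 1)
    return np.eye(dim)[bins]  # n x dim matrix of cell indicators

def select_model(x, y, dims, sigma):
    """Least-squares fit on the space minimising the penalised criterion."""
    n = len(y)
    best = None
    for dim in dims:
        X = histogram_design(x, dim)
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        fitted = X @ coef
        # empirical contrast + C_p-style penalty proportional to dimension / n
        crit = np.sum((y - fitted) ** 2) / n + 2 * sigma ** 2 * dim / n
        if best is None or crit < best[0]:
            best = (crit, dim, fitted)
    return best  # (criterion value, selected dimension, fitted values)

# Toy usage on simulated i.i.d. data (the paper's framework allows dependence).
rng = np.random.default_rng(0)
x = rng.uniform(size=500)
y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(500)
crit, dim, fit = select_model(x, y, dims=range(1, 51), sigma=0.3)
print(f"selected dimension: {dim}")
```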

Type
Research Article
Copyright
© EDP Sciences, SMAI, 2001

References

H. Akaike, Information theory and an extension of the maximum likelihood principle, in Proc. 2nd International Symposium on Information Theory, edited by B.N. Petrov and F. Csáki. Akadémiai Kiadó, Budapest (1973) 267-281.
Akaike, H., A new look at the statistical model identification. IEEE Trans. Automat. Control 19 (1974) 716-723.
Ango Nze, P., Geometric and subgeometric rates for Markovian processes in the neighbourhood of linearity. C. R. Acad. Sci. Paris 326 (1998) 371-376.
Baraud, Y., Model selection for regression on a fixed design. Probab. Theory Related Fields 117 (2000) 467-493.
Y. Baraud, Model selection for regression on a random design, Preprint 01-10. DMA, École Normale Supérieure (2001).
Y. Baraud, F. Comte and G. Viennet, Adaptive estimation in autoregression or β-mixing regression via model selection. Ann. Statist. (to appear).
Barron, A., Birgé, L. and Massart, P., Risk bounds for model selection via penalization. Probab. Theory Related Fields 113 (1999) 301-413.
Birgé, L. and Massart, P., An adaptive compression algorithm in Besov spaces. Constr. Approx. 16 (2000) 1-36.
L. Birgé and Y. Rozenholc, How many bins must be put in a regular histogram. Working paper (2001).
Cohen, A., Daubechies, I. and Vial, P., Wavelets on the interval and fast wavelet transforms. Appl. Comput. Harmon. Anal. 1 (1993) 54-81.
I. Daubechies, Ten Lectures on Wavelets. SIAM, Philadelphia (1992).
R.A. DeVore and G.G. Lorentz, Constructive Approximation. Springer-Verlag (1993).
Donoho, D.L. and Johnstone, I.M., Minimax estimation via wavelet shrinkage. Ann. Statist. 26 (1998) 879-921.
P. Doukhan, Mixing: Properties and Examples. Springer-Verlag (1994).
M. Duflo, Random Iterative Models. Springer, Berlin, New York (1997).
Hoffmann, M., On nonparametric estimation in nonlinear AR(1)-models. Statist. Probab. Lett. 44 (1999) 29-45.
Ibragimov, I.A., On the spectrum of stationary Gaussian sequences satisfying the strong mixing condition I: Necessary conditions. Theory Probab. Appl. 10 (1965) 85-106.
M. Kohler, On optimal rates of convergence for nonparametric regression with random design, Working Paper. Stuttgart University (1997).
Kolmogorov, A.N. and Rozanov, Y.A., On the strong mixing conditions for stationary Gaussian sequences. Theory Probab. Appl. 5 (1960) 204-207.
Li, K.C., Asymptotic optimality for Cp, CL, cross-validation and generalized cross-validation: Discrete index set. Ann. Statist. 15 (1987) 958-975.
G.G. Lorentz, M. von Golitschek and Y. Makovoz, Constructive Approximation, Advanced Problems. Springer, Berlin (1996).
Mallows, C.L., Some comments on Cp. Technometrics 15 (1973) 661-675.
A. Meyer, Quelques inégalités sur les martingales d'après Dubins et Freedman, Séminaire de Probabilités de l'Université de Strasbourg. Vols. 68/69 (1969) 162-169.
Modha, D.S. and Masry, E., Minimum complexity regression estimation with weakly dependent observations. IEEE Trans. Inform. Theory 42 (1996) 2133-2145.
Modha, D.S. and Masry, E., Memory-universal prediction of stationary random processes. IEEE Trans. Inform. Theory 44 (1998) 117-133.
Neumann, M. and Kreiss, J.-P., Regression-type inference in nonparametric autoregression. Ann. Statist. 26 (1998) 1570-1613.
Polyak, B.T. and Tsybakov, A., A family of asymptotically optimal methods for choosing the order of a projective regression estimate. Theory Probab. Appl. 37 (1992) 471-481.
Shibata, R., Selection of the order of an autoregressive model by Akaike's information criterion. Biometrika 63 (1976) 117-126.
Shibata, R., An optimal selection of regression variables. Biometrika 68 (1981) 45-54.
Van de Geer, S., Exponential inequalities for martingales, with application to maximum likelihood estimation for counting processes. Ann. Statist. 23 (1995) 1779-1801.
Volkonskii, V.A. and Rozanov, Y.A., Some limit theorems for random functions. I. Theory Probab. Appl. 4 (1959) 179-197.