Greedy algorithms for prediction. / Sancetta, Alessio.

In: Bernoulli, Vol. 22, No. 2, 01.05.2016, p. 1227-1277.

Research output: Contribution to journal › Article › peer-review

Published

Standard

Greedy algorithms for prediction. / Sancetta, Alessio.

In: Bernoulli, Vol. 22, No. 2, 01.05.2016, p. 1227-1277.

Research output: Contribution to journal › Article › peer-review

Harvard

Sancetta, A 2016, 'Greedy algorithms for prediction', Bernoulli, vol. 22, no. 2, pp. 1227-1277. https://doi.org/10.3150/14-BEJ691

Author

Sancetta, Alessio. / Greedy algorithms for prediction. In: Bernoulli. 2016 ; Vol. 22, No. 2. pp. 1227-1277.

BibTeX

@article{41d03b18ffd64ec6ac366066346b8b59,
title = "Greedy algorithms for prediction",
abstract = "In many prediction problems, it is not uncommon that the number of variables used to construct a forecast is of the same order of magnitude as the sample size, if not larger. We then face the problem of constructing a prediction in the presence of potentially large estimation error. Control of the estimation error is achieved either by selecting variables or by combining all the variables in some special way. This paper considers greedy algorithms to solve this problem. It is shown that the resulting estimators are consistent under weak conditions. In particular, the derived rates of convergence are either minimax or improve on those given in the literature, allowing for dependence and unbounded regressors. Some versions of the algorithms provide a fast solution to problems such as the Lasso.",
author = "Alessio Sancetta",
year = "2016",
month = may,
day = "1",
doi = "10.3150/14-BEJ691",
language = "English",
volume = "22",
pages = "1227--1277",
journal = "Bernoulli",
issn = "1350-7265",
publisher = "International Statistical Institute",
number = "2",
}

RIS

TY - JOUR

T1 - Greedy algorithms for prediction

AU - Sancetta, Alessio

PY - 2016/5/1

Y1 - 2016/5/1

N2 - In many prediction problems, it is not uncommon that the number of variables used to construct a forecast is of the same order of magnitude as the sample size, if not larger. We then face the problem of constructing a prediction in the presence of potentially large estimation error. Control of the estimation error is achieved either by selecting variables or by combining all the variables in some special way. This paper considers greedy algorithms to solve this problem. It is shown that the resulting estimators are consistent under weak conditions. In particular, the derived rates of convergence are either minimax or improve on those given in the literature, allowing for dependence and unbounded regressors. Some versions of the algorithms provide a fast solution to problems such as the Lasso.

AB - In many prediction problems, it is not uncommon that the number of variables used to construct a forecast is of the same order of magnitude as the sample size, if not larger. We then face the problem of constructing a prediction in the presence of potentially large estimation error. Control of the estimation error is achieved either by selecting variables or by combining all the variables in some special way. This paper considers greedy algorithms to solve this problem. It is shown that the resulting estimators are consistent under weak conditions. In particular, the derived rates of convergence are either minimax or improve on those given in the literature, allowing for dependence and unbounded regressors. Some versions of the algorithms provide a fast solution to problems such as the Lasso.

U2 - 10.3150/14-BEJ691

DO - 10.3150/14-BEJ691

M3 - Article

VL - 22

SP - 1227

EP - 1277

JO - Bernoulli

JF - Bernoulli

SN - 1350-7265

IS - 2

ER -
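To illustrate the kind of procedure the abstract describes — greedily selecting variables one at a time when the number of candidate regressors is comparable to the sample size — here is a minimal sketch of a generic forward greedy (orthogonal-matching-pursuit-style) selection rule. This is a standard textbook variant for illustration only, not the specific algorithms analysed in the paper; the function name and toy data are invented for this example.

```python
import numpy as np

def greedy_forecast(X, y, max_steps=5):
    """Forward greedy selection (orthogonal-matching-pursuit style):
    at each step, add the regressor most correlated with the current
    residual, then refit by least squares on the selected set."""
    n, p = X.shape
    selected = []
    residual = y.copy()
    beta = np.zeros(p)
    for _ in range(max_steps):
        # Score each column by absolute inner product with the residual.
        scores = np.abs(X.T @ residual)
        scores[selected] = -np.inf  # never re-select a chosen variable
        j = int(np.argmax(scores))
        selected.append(j)
        # Refit by OLS on the currently selected columns.
        coef, *_ = np.linalg.lstsq(X[:, selected], y, rcond=None)
        beta = np.zeros(p)
        beta[selected] = coef
        residual = y - X @ beta
    return beta, selected

# Toy example: 200 observations, 50 candidate regressors, 3 truly relevant.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
y = X[:, [3, 17, 40]] @ np.array([2.0, -1.5, 1.0]) + 0.1 * rng.standard_normal(200)
beta, selected = greedy_forecast(X, y, max_steps=3)
print(sorted(selected))  # typically recovers the three relevant variables
```

Each greedy step costs only one pass over the candidate columns plus a small least-squares fit, which is what makes such schemes attractive as fast approximate solvers for problems like the Lasso when p is large.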