Hedging predictions in machine learning. / Gammerman, Alexander; Vovk, Vladimir.

2006.

Research output: Working paper

Published

BibTeX

@techreport{d00c66de27d3469db9236cd1d579bb0e,
title = "Hedging predictions in machine learning",
abstract = "Recent advances in machine learning make it possible to design efficient prediction algorithms for data sets with huge numbers of parameters. This paper describes a new technique for {"}hedging{"} the predictions output by many such algorithms, including support vector machines, kernel ridge regression, kernel nearest neighbours, and by many other state-of-the-art methods. The hedged predictions for the labels of new objects include quantitative measures of their own accuracy and reliability. These measures are provably valid under the assumption of randomness, traditional in machine learning: the objects and their labels are assumed to be generated independently from the same probability distribution. In particular, it becomes possible to control (up to statistical fluctuations) the number of erroneous predictions by selecting a suitable confidence level. Validity being achieved automatically, the remaining goal of hedged prediction is efficiency: taking full account of the new objects' features and other available information to produce as accurate predictions as possible. This can be done successfully using the powerful machinery of modern machine learning.",
keywords = "cs.LG",
author = "Alexander Gammerman and Vladimir Vovk",
note = "24 pages; 9 figures; 2 tables; a version of this paper (with discussion and rejoinder) published in {"}Computer Journal{"}",
year = "2006",
month = nov,
day = "2",
language = "English",
type = "WorkingPaper",

}

RIS

TY - UNPB

T1 - Hedging predictions in machine learning

AU - Gammerman, Alexander

AU - Vovk, Vladimir

N1 - 24 pages; 9 figures; 2 tables; a version of this paper (with discussion and rejoinder) published in "Computer Journal"

PY - 2006/11/2

Y1 - 2006/11/2

N2 - Recent advances in machine learning make it possible to design efficient prediction algorithms for data sets with huge numbers of parameters. This paper describes a new technique for "hedging" the predictions output by many such algorithms, including support vector machines, kernel ridge regression, kernel nearest neighbours, and by many other state-of-the-art methods. The hedged predictions for the labels of new objects include quantitative measures of their own accuracy and reliability. These measures are provably valid under the assumption of randomness, traditional in machine learning: the objects and their labels are assumed to be generated independently from the same probability distribution. In particular, it becomes possible to control (up to statistical fluctuations) the number of erroneous predictions by selecting a suitable confidence level. Validity being achieved automatically, the remaining goal of hedged prediction is efficiency: taking full account of the new objects' features and other available information to produce as accurate predictions as possible. This can be done successfully using the powerful machinery of modern machine learning.

AB - Recent advances in machine learning make it possible to design efficient prediction algorithms for data sets with huge numbers of parameters. This paper describes a new technique for "hedging" the predictions output by many such algorithms, including support vector machines, kernel ridge regression, kernel nearest neighbours, and by many other state-of-the-art methods. The hedged predictions for the labels of new objects include quantitative measures of their own accuracy and reliability. These measures are provably valid under the assumption of randomness, traditional in machine learning: the objects and their labels are assumed to be generated independently from the same probability distribution. In particular, it becomes possible to control (up to statistical fluctuations) the number of erroneous predictions by selecting a suitable confidence level. Validity being achieved automatically, the remaining goal of hedged prediction is efficiency: taking full account of the new objects' features and other available information to produce as accurate predictions as possible. This can be done successfully using the powerful machinery of modern machine learning.

KW - cs.LG

M3 - Working paper

BT - Hedging predictions in machine learning

ER -
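
The abstract above describes hedged prediction in general terms: each prediction comes with a quantitative measure of its reliability, validity holds under the randomness (exchangeability) assumption, and the long-run number of errors is controlled by the chosen confidence level. The sketch below is not code from the paper; it is a minimal illustrative conformal predictor built on a 1-nearest-neighbour nonconformity measure, with all function names and the toy data chosen purely for illustration. The paper's recipe applies the same construction to support vector machines, kernel ridge regression, and other underlying methods.

# Minimal sketch (illustrative only) of a conformal predictor that "hedges"
# the output of a simple underlying method via a 1-nearest-neighbour
# nonconformity measure. Names and data here are assumptions, not the
# paper's code.
import numpy as np

def knn_nonconformity(X, y):
    """Nonconformity score of each example: distance to the nearest example
    with the same label divided by the distance to the nearest example with
    a different label (larger means stranger)."""
    n = len(y)
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)  # ignore each example's distance to itself
    scores = np.empty(n)
    for i in range(n):
        same = d[i, y == y[i]]
        other = d[i, y != y[i]]
        near_same = same.min() if same.size else np.inf
        near_other = other.min() if other.size else np.inf
        scores[i] = near_same / near_other
    return scores

def conformal_prediction_set(X_train, y_train, x_new, labels, epsilon=0.05):
    """Return the set of labels whose conformal p-value exceeds epsilon.
    Under exchangeability, the true label falls outside this set with
    probability at most epsilon."""
    prediction_set = []
    for candidate in labels:
        # Tentatively append the new object with the candidate label
        X_aug = np.vstack([X_train, x_new])
        y_aug = np.append(y_train, candidate)
        alphas = knn_nonconformity(X_aug, y_aug)
        # p-value: fraction of examples at least as nonconforming as the new one
        p_value = np.mean(alphas >= alphas[-1])
        if p_value > epsilon:
            prediction_set.append(candidate)
    return prediction_set

# Toy usage: two well-separated clusters; at 95% confidence the prediction
# set for a point near cluster 0 is typically just [0].
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(3, 0.3, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
print(conformal_prediction_set(X, y, np.array([[0.1, 0.2]]), labels=[0, 1]))

The confidence level 1 - epsilon is exactly the dial the abstract refers to: raising it makes the prediction sets larger but keeps the probability of excluding the true label at most epsilon, up to statistical fluctuations.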