Competitive on-line learning with a convex loss function. / Vovk, Vladimir.

2005.

Research output: Working paper



Abstract

We consider the problem of sequential decision making under uncertainty in which the loss caused by a decision depends on the binary observation that follows it. In competitive on-line learning, the goal is to design decision algorithms that are almost as good as the best decision rules in a wide benchmark class, without making any assumptions about the way the observations are generated. However, standard algorithms in this area can only deal with finite-dimensional (often countable) benchmark classes. In this paper we give similar results for decision rules ranging over an arbitrary reproducing kernel Hilbert space. For example, it is shown that for a wide class of loss functions (including the standard square, absolute, and log loss functions) the average loss of the master algorithm, over the first $N$ observations, does not exceed the average loss of the best decision rule with a bounded norm plus $O(N^{-1/2})$. Our proof technique is very different from the standard ones and is based on recent results about defensive forecasting. Given the probabilities produced by a defensive forecasting algorithm, which are known to be well calibrated and to have good resolution in the long run, we use the expected loss minimization principle to find a suitable decision.
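The expected loss minimization step mentioned above can be sketched in a few lines of Python. This is only an illustrative reconstruction, not the paper's algorithm: the function names, the grid-search minimizer, and the choice of square loss are all assumptions made for the example. Given a forecast probability `p` that the binary observation will be 1, the decision maker picks the decision with the smallest expected loss under `p`.

```python
def expected_loss(decision, p, loss):
    """Expected loss of `decision` when the observation is 1 with probability p."""
    return p * loss(decision, 1) + (1 - p) * loss(decision, 0)

def minimize_expected_loss(p, loss, grid_size=10001):
    """Pick the decision in [0, 1] with the smallest expected loss.

    A simple grid search (illustrative assumption; the paper does not
    prescribe this minimization method).
    """
    best_d, best_val = 0.0, float("inf")
    for i in range(grid_size):
        d = i / (grid_size - 1)
        v = expected_loss(d, p, loss)
        if v < best_val:
            best_d, best_val = d, v
    return best_d

# One of the standard loss functions named in the abstract.
def square_loss(d, y):
    return (d - y) ** 2

# For square loss, the expected-loss-minimizing decision equals the forecast
# probability itself: d* = p.
d_star = minimize_expected_loss(0.3, square_loss)
```

For square loss the expected loss $p(d-1)^2 + (1-p)d^2$ is minimized at $d = p$, so `d_star` comes out close to `0.3`; other losses from the abstract's class (absolute, log) would yield different minimizers.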
Original language: English
Publication status: Published - 11 Jun 2005
This open access research output is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License.
