Abstract
Predicting the future is an important purpose of machine learning research. In
online learning, predictions are given sequentially rather than all at once. People
wish to make sensible decisions sequentially in many situations of everyday
life, whether month by month, day by day, or minute by minute.
In competitive prediction, the predictions are made by a set of experts and
by a learner. The quality of the predictions is measured by a loss function.
The goal of the learner is to make reliable predictions under any circumstances.
The learner compares its loss with the loss of the best expert in the set
and ensures that its performance is not much worse.
In this thesis a general methodology is described to provide algorithms with
strong performance guarantees for many prediction problems. Specific attention
is paid to the square loss function, widely used to assess the quality of
predictions. Four types of expert sets are considered in this thesis: sets
with a finite number of free experts (which are not required to follow any strategy),
sets of experts following strategies from finite-dimensional spaces, sets of
experts following strategies from infinite-dimensional Hilbert spaces, and sets
of experts following strategies from infinite-dimensional Banach spaces. The
power of the methodology is illustrated in the derivations of various prediction
algorithms.
Two core approaches are explored in this thesis: the Aggregating Algorithm
and Defensive Forecasting. These approaches are close to each other in many
interesting cases. However, Defensive Forecasting is more general and covers
some problems which cannot be solved using the Aggregating Algorithm. The
Aggregating Algorithm is more specific and is often more computationally
efficient.
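As an illustration of the kind of guarantee discussed above, the following is a minimal sketch of the Aggregating Algorithm for square loss with predictions and outcomes in [0, 1], using the standard substitution function for this loss. The function name and interface are illustrative, not taken from the thesis.

```python
import math


def aggregating_algorithm(expert_preds, outcomes, eta=2.0):
    """Aggregating Algorithm for square loss on [0, 1].

    expert_preds: list of rounds, each a list of the experts' predictions.
    outcomes: observed outcomes, one per round.
    Returns (learner_loss, expert_losses).
    """
    n = len(expert_preds[0])
    log_w = [0.0] * n          # log-weights; uniform prior over experts
    learner_loss = 0.0
    expert_losses = [0.0] * n

    for preds, y in zip(expert_preds, outcomes):
        # Generalized prediction:
        # g(y) = -(1/eta) * ln( sum_i (w_i / W) * exp(-eta * (y - p_i)^2) ),
        # computed with the log-sum-exp trick for numerical stability.
        def g(yy):
            exps = [log_w[i] - eta * (yy - preds[i]) ** 2 for i in range(n)]
            m = max(exps)
            num = m + math.log(sum(math.exp(e - m) for e in exps))
            mw = max(log_w)
            den = mw + math.log(sum(math.exp(lw - mw) for lw in log_w))
            return -(num - den) / eta

        # Substitution function for square loss on [0, 1] (Vovk's form).
        gamma = 0.5 + (g(0.0) - g(1.0)) / 2.0

        learner_loss += (y - gamma) ** 2
        for i in range(n):
            expert_losses[i] += (y - preds[i]) ** 2
            log_w[i] -= eta * (y - preds[i]) ** 2  # exponential weight update

    return learner_loss, expert_losses
```

With eta = 2 (the mixability constant for square loss on [0, 1]) the learner's cumulative loss is guaranteed to be at most the best expert's loss plus ln(n)/eta, regardless of the data.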
The empirical performance and properties of new algorithms are validated
on artificial and real-world data sets. Specific areas where the algorithms can
be applied are emphasized.
Original language: English
Qualification: PhD
Supervisors/Advisors:
Award date: 1 Jan 2011
Publication status: Unpublished, 2011
Projects: 1 Finished
PCP: Practical Competitive Prediction
Vovk, V., Gammerman, A., Kalnishkan, Y., Chernov, A. & Zhdanov, F.
Eng & Phys Sci Res Council EPSRC
10/10/07 → 22/11/10
Project: Research