**Conformal and Venn Predictors for Multi-probabilistic Predictions and Their Applications.** / Zhou, Chenzhe.

Research output: Thesis › Doctoral Thesis

Unpublished


Zhou, C 2015, 'Conformal and Venn Predictors for Multi-probabilistic Predictions and Their Applications', Ph.D., Royal Holloway, University of London.

Zhou, C. (2015). *Conformal and Venn Predictors for Multi-probabilistic Predictions and Their Applications* [Doctoral thesis, Royal Holloway, University of London].

Zhou C. Conformal and Venn Predictors for Multi-probabilistic Predictions and Their Applications. 2015.

@phdthesis{590fadf0525542a699d284b82e5b8355,

title = "Conformal and Venn Predictors for Multi-probabilistic Predictions and Their Applications",

abstract = "In machine learning, a typical algorithm predicts properties of unknown objects from patterns learned on a training data set. However, most algorithms output only a single prediction (a predicted label for classification, a predicted value for regression). Demand for probabilistic prediction has grown because a prediction accompanied by a probability estimate is more informative than a bare prediction. Probabilistic weather forecasting is one example: we would rather hear that tomorrow has a 60% chance of rain than simply that it will rain. In many areas, however, even a single probabilistic prediction is not enough, since the true probability may be higher or lower than the estimate. If we instead make multiple probabilistic predictions that hedge the true probability within an interval, we obtain a better estimate. This is the idea of multi-probability prediction: announcing several probability distributions for the new label rather than a single one. In this thesis, we propose several novel designs of Venn predictors and Conformal predictors that provide multi-probabilistic predictions together with single predictions. These implementations are based on k-Nearest Neighbours, Support Vector Machines, and Crammer and Singer{\textquoteright}s Multi-Class Support Vector Machines, and they achieve high accuracy together with probabilistic predictions. Experimental testing is carried out, and the algorithms are compared with other methods for probabilistic prediction; the results demonstrate the advantages of applying these algorithms.",

author = "Chenzhe Zhou",

year = "2015",

language = "English",

school = "Royal Holloway, University of London",

}

TY - THES

T1 - Conformal and Venn Predictors for Multi-probabilistic Predictions and Their Applications

AU - Zhou, Chenzhe

PY - 2015

Y1 - 2015

N2 - In machine learning, a typical algorithm predicts properties of unknown objects from patterns learned on a training data set. However, most algorithms output only a single prediction (a predicted label for classification, a predicted value for regression). Demand for probabilistic prediction has grown because a prediction accompanied by a probability estimate is more informative than a bare prediction. Probabilistic weather forecasting is one example: we would rather hear that tomorrow has a 60% chance of rain than simply that it will rain. In many areas, however, even a single probabilistic prediction is not enough, since the true probability may be higher or lower than the estimate. If we instead make multiple probabilistic predictions that hedge the true probability within an interval, we obtain a better estimate. This is the idea of multi-probability prediction: announcing several probability distributions for the new label rather than a single one. In this thesis, we propose several novel designs of Venn predictors and Conformal predictors that provide multi-probabilistic predictions together with single predictions. These implementations are based on k-Nearest Neighbours, Support Vector Machines, and Crammer and Singer’s Multi-Class Support Vector Machines, and they achieve high accuracy together with probabilistic predictions. Experimental testing is carried out, and the algorithms are compared with other methods for probabilistic prediction; the results demonstrate the advantages of applying these algorithms.

AB - In machine learning, a typical algorithm predicts properties of unknown objects from patterns learned on a training data set. However, most algorithms output only a single prediction (a predicted label for classification, a predicted value for regression). Demand for probabilistic prediction has grown because a prediction accompanied by a probability estimate is more informative than a bare prediction. Probabilistic weather forecasting is one example: we would rather hear that tomorrow has a 60% chance of rain than simply that it will rain. In many areas, however, even a single probabilistic prediction is not enough, since the true probability may be higher or lower than the estimate. If we instead make multiple probabilistic predictions that hedge the true probability within an interval, we obtain a better estimate. This is the idea of multi-probability prediction: announcing several probability distributions for the new label rather than a single one. In this thesis, we propose several novel designs of Venn predictors and Conformal predictors that provide multi-probabilistic predictions together with single predictions. These implementations are based on k-Nearest Neighbours, Support Vector Machines, and Crammer and Singer’s Multi-Class Support Vector Machines, and they achieve high accuracy together with probabilistic predictions. Experimental testing is carried out, and the algorithms are compared with other methods for probabilistic prediction; the results demonstrate the advantages of applying these algorithms.

M3 - Doctoral Thesis

ER -
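
To illustrate the multi-probability idea described in the abstract: a Venn predictor tries every hypothetical label for the new object, assigns the extended set of examples to categories via a "Venn taxonomy", and reports the empirical label frequencies in the new object's category — one probability distribution per hypothesis. The sketch below is not one of the thesis's implementations; it is a minimal illustration assuming a simple 1-nearest-neighbour taxonomy (each example's category is the label of its nearest neighbour), with hypothetical function names.

```python
import numpy as np

def nearest_label(X, y, i):
    """Label of example i's nearest neighbour (excluding itself)."""
    d = np.linalg.norm(X - X[i], axis=1)
    d[i] = np.inf  # never pick the example itself
    return y[np.argmin(d)]

def venn_predict(X_train, y_train, x_new, labels):
    """Return one label distribution per hypothetical label of x_new."""
    multiprob = {}
    for hyp in labels:
        # Tentatively extend the training set with (x_new, hyp).
        X = np.vstack([X_train, x_new])
        y = np.append(y_train, hyp)
        # Category of every example under the 1-NN taxonomy.
        cats = np.array([nearest_label(X, y, i) for i in range(len(y))])
        # Empirical label frequencies within x_new's category.
        group = cats == cats[-1]
        multiprob[hyp] = {lab: float(np.mean(y[group] == lab))
                          for lab in labels}
    return multiprob
```

The result is a set of distributions whose spread plays the role of the interval mentioned in the abstract: the further apart the distributions, the less the data constrain the new label's probability.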