Universal algorithms for multinomial logistic regression under Kullback-Leibler game

Raisa Dzhamtyrova, Yuri Kalnishkan

Research output: Contribution to journal › Article › peer-review


Abstract

We consider the framework of competitive prediction, where performance guarantees are given relative to other predictive models, called experts. We propose a universal algorithm for predicting finite-dimensional distributions, i.e., points of a probability simplex, under the Kullback-Leibler game. In the standard framework for prediction with expert advice, the performance of the learner is measured by means of the cumulative loss. In this paper we consider a generalisation of this setting in which losses are discounted with time. A natural choice of predictors for probability games is the class of multinomial logistic regression functions, as they output a distribution lying inside the probability simplex. We take the class of multinomial logistic regressions as our experts. We provide a strategy that allows us to "track the best expert" of this type and derive a theoretical bound on the discounted loss of the strategy. We also provide a kernelized version of our algorithm, which competes with a wider set of experts from a Reproducing Kernel Hilbert Space (RKHS), and prove a theoretical guarantee for the kernelized strategy. We carry out experiments on three data sets and compare the cumulative losses of our algorithm with those of multinomial logistic regression.
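For orientation, a minimal sketch of the objects the abstract refers to, assuming the standard definitions of the Kullback-Leibler loss, softmax (multinomial logistic) experts, and geometric discounting; the paper's exact formulation, in particular its discounting scheme, may differ in detail:

    % KL loss of a prediction \gamma \in \Delta_n against an outcome \omega \in \Delta_n
    \lambda(\omega, \gamma) = \sum_{i=1}^{n} \omega_i \ln \frac{\omega_i}{\gamma_i}

    % A multinomial logistic regression expert with parameters \theta_1, \dots, \theta_n
    % maps an input x to a point of the simplex via the softmax
    \gamma_i(x) = \frac{e^{\theta_i^\top x}}{\sum_{j=1}^{n} e^{\theta_j^\top x}},
    \qquad i = 1, \dots, n

    % Discounted cumulative loss with discount factor \alpha \in (0, 1]
    % (\alpha = 1 recovers the usual undiscounted cumulative loss)
    L_T = \sum_{t=1}^{T} \alpha^{T-t}\, \lambda(\omega_t, \gamma_t)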
Original language: English
Pages (from-to): 369-380
Number of pages: 12
Journal: Neurocomputing
Volume: 397
Early online date: 28 Nov 2019
DOIs
Publication status: Published - 15 Jul 2020

Keywords

  • online learning
  • sequential prediction
  • competitive prediction
  • universal algorithms
  • loss bounds
  • Kullback-Leibler game
