Randomness, exchangeability, and conformal prediction

Research output: Chapter in Book/Report/Conference proceeding › Chapter

Abstract

This paper argues for a wider use of the functional theory of randomness, a modification of the algorithmic theory of randomness that gets rid of unspecified additive constants. Both theories are useful for understanding relations between the assumption of IID data and data exchangeability. While the assumption of IID data is standard in machine learning, conformal prediction relies on the weaker assumption of data exchangeability. Nouretdinov, V'yugin, and Gammerman showed, using the language of the algorithmic theory of randomness, that conformal prediction is a universal method under the assumption of IID data. In this paper, I will selectively review connections between exchangeability and the property of being IID, the early history of conformal prediction, my encounters and collaboration with Alex and other interesting people, and a translation of Nouretdinov et al.'s results into the language of the functional theory of randomness, which moves those results closer to practice. Namely, the translation says that every confidence predictor that is valid for IID data can be converted into a conformal predictor without losing much in predictive efficiency.
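The abstract's notion of a conformal predictor valid under exchangeability can be illustrated by a minimal split-conformal sketch for regression. This example is mine, not from the chapter; the function name and the choice of absolute residuals as the nonconformity score are illustrative assumptions.

```python
import numpy as np

def split_conformal_interval(residuals_cal, y_pred_new, alpha=0.1):
    """Return a prediction interval around a point prediction.

    residuals_cal: absolute residuals |y - y_hat| on a held-out
    calibration set. Under exchangeability of calibration and test
    examples, the interval covers the true label with probability
    at least 1 - alpha.
    """
    n = len(residuals_cal)
    # Conformal quantile: the ceil((n + 1)(1 - alpha))-th smallest
    # calibration residual (capped at the largest one).
    k = int(np.ceil((n + 1) * (1 - alpha)))
    q = np.sort(residuals_cal)[min(k, n) - 1]
    return y_pred_new - q, y_pred_new + q
```

With 100 calibration residuals and alpha = 0.1, the interval half-width is the 91st smallest residual, reflecting the (n + 1) correction that makes the guarantee hold in finite samples.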
Original language: English
Title of host publication: The Importance of Being Learnable
Subtitle of host publication: Essays Dedicated to Alexander Gammerman
Editors: Khuong Nguyen, Zhiyuan Luo
Place of publication: Cham, Switzerland
Publisher: Springer
Pages: 87-117
Number of pages: 31
ISBN (Electronic): 978-3-032-15120-9
ISBN (Print): 978-3-032-15119-3
Publication status: Published - 2026

Publication series

Name: Lecture Notes in Computer Science
Volume: 16290

Keywords

  • conformal prediction
  • universality of conformal prediction
  • fundamental limitation of conformal prediction
  • functional theory of randomness
  • IID
  • exchangeability
  • p-values
  • e-values
