Testing exchangeability in the batch mode with e-values and Markov alternatives

Research output: Contribution to journal › Article › peer-review

Abstract

The topic of this paper is testing the assumption of exchangeability, which is the standard assumption in mainstream machine learning. The common approaches are online testing by betting (such as conformal testing) and the older batch testing using p-values (as in classical hypothesis testing). The approach of this paper is intermediate in that we are interested in batch testing by betting; as a result, p-values are replaced by e-values. As a first step in this direction, this paper concentrates on the Markov model as the alternative. The null hypothesis of exchangeability is formalized as a Kolmogorov-type compression model, and the Bayes mixture of the Markov model with respect to the uniform prior is taken as the simple alternative hypothesis. Using e-values instead of p-values leads to a computationally efficient testing procedure. Two appendixes discuss connections with the algorithmic theory of randomness; in particular, the test proposed in this paper can be interpreted as a poor man's version of Kolmogorov's deficiency of randomness.
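The construction described in the abstract can be illustrated for binary sequences. Under the exchangeability null, conditionally on the counts every ordering of the sequence is equally likely; for the alternative, the Bayes mixture of binary Markov chains under the uniform prior has a closed-form marginal likelihood. An e-value is then the likelihood ratio of the two. The Python sketch below is a minimal illustrative reconstruction under these assumptions, not the paper's exact procedure; all function names are ours.

```python
from math import comb, factorial

def markov_mixture_prob(x):
    """Bayes mixture of binary Markov chains under the uniform prior.

    Uses the closed form  ∫_0^1 p^a (1-p)^b dp = a! b! / (a+b+1)!
    for the marginal likelihood of the transitions out of each state,
    and a uniform prior on the initial-symbol probability.
    """
    def beta_int(a, b):
        return factorial(a) * factorial(b) / factorial(a + b + 1)

    prob = 0.5  # uniform prior on P(x_1 = 1) integrates to 1/2 either way
    n = [[0, 0], [0, 0]]  # n[s][t] = number of transitions s -> t
    for prev, cur in zip(x, x[1:]):
        n[prev][cur] += 1
    for s in (0, 1):
        # a = transitions s -> 1, b = transitions s -> 0
        prob *= beta_int(n[s][1], n[s][0])
    return prob

def exchangeability_prob(x):
    """Under the exchangeability null, conditionally on the counts,
    every ordering of the sequence is equally likely."""
    return 1 / comb(len(x), sum(x))

def e_value(x):
    """Likelihood ratio: values much larger than 1 are evidence
    against exchangeability.  Conditionally on the counts, its
    expectation under the null is at most 1, so it is a valid e-value."""
    return markov_mixture_prob(x) / exchangeability_prob(x)
```

For example, a strictly alternating sequence (strong Markov structure) yields a larger e-value than the same counts in a less patterned order, while under the null the e-value cannot be large in expectation.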
Original language: English
Article number: 99
Number of pages: 27
Journal: Machine Learning
Volume: 114
DOIs
Publication status: Published - 21 Feb 2025

Keywords

  • testing exchangeability
  • batch compression models
  • e-values
  • algorithmic theory of randomness
