Abstract
Deep Learning predictions with measurable confidence are increasingly desirable for real-world problems, especially in high-risk settings. The Conformal Prediction (CP) framework is a versatile solution that guarantees a maximum error rate given minimal constraints. In this paper, we propose a novel conformal loss function that approximates the traditionally two-step CP approach in a single step. By evaluating and penalising deviations from the stringent expected CP output distribution, a Deep Learning model may learn the direct relationship between the input data and the conformal p-values. We carry out a comprehensive empirical evaluation to show our novel loss function's competitiveness for seven binary and multi-class prediction tasks on five benchmark datasets. On the same datasets, our approach achieves significant training time reductions of up to 86% compared to Aggregated Conformal Prediction (ACP), while maintaining comparable approximate validity and predictive efficiency.
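The traditional two-step CP procedure the abstract refers to can be sketched as follows. This is a minimal split-conformal illustration; the function names, toy scores, and the choice of "greater or equal" nonconformity ordering are ours, not taken from the paper:

```python
# Minimal sketch of split Conformal Prediction (the two-step approach
# the proposed loss function approximates). All names and toy data
# below are illustrative assumptions, not from the paper.

def conformal_p_value(test_score, calibration_scores):
    """p-value = smoothed fraction of calibration nonconformity scores
    at least as large as the test score."""
    n = len(calibration_scores)
    ge = sum(1 for s in calibration_scores if s >= test_score)
    return (ge + 1) / (n + 1)

def prediction_set(class_scores, calibration_scores_per_class, alpha=0.1):
    """Include every label whose conformal p-value exceeds alpha; under
    exchangeability this bounds the error rate by alpha."""
    return [
        label
        for label, score in class_scores.items()
        if conformal_p_value(score, calibration_scores_per_class[label]) > alpha
    ]

# Toy example: per-class nonconformity scores for a binary task.
cal = {"cat": [0.2, 0.4, 0.5, 0.9], "dog": [0.1, 0.3, 0.6, 0.8]}
test = {"cat": 0.35, "dog": 0.95}
print(prediction_set(test, cal, alpha=0.2))  # → ['cat']
```

The paper's single-step approach would instead train the network so its outputs behave like these p-values directly, removing the separate calibration pass.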
Original language | English |
---|---|
Journal | Annals of Mathematics and Artificial Intelligence |
DOIs | |
Publication status | Published - 1 Jul 2023 |
Keywords
- Prediction confidence
- Deep Learning
- Conformal Prediction