Uncertainty Quantification of Multimodal Models

Research output: Chapter in Book/Report/Conference proceeding


Abstract

Multimodal classification models, particularly those designed for fine-grained tasks, offer significant potential for a wide range of applications. However, their inability to effectively manage uncertainty often limits their usefulness: unreliable predictions can lead to suboptimal decision-making in real-world scenarios. We propose integrating conformal prediction into multimodal classification models to address this challenge. Conformal prediction is a robust technique for quantifying uncertainty by generating sets of plausible classifications for unseen data; these sets come with guaranteed confidence levels, providing a transparent assessment of the reliability of the model's predictions. By integrating conformal prediction, our objective is to increase the reliability and trustworthiness of multimodal classification models, thereby enabling more informed decision-making in contexts where uncertainty is a significant factor.
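To make the mechanism concrete, below is a minimal sketch of split (inductive) conformal prediction for classification — the standard recipe the abstract describes, not necessarily the chapter's exact method. It assumes the model outputs softmax scores; the function name, the score choice (1 minus the true-class probability), and the variable names are illustrative assumptions.

```python
import numpy as np

def conformal_prediction_sets(cal_probs, cal_labels, test_probs, alpha=0.1):
    """Split conformal prediction for classification (illustrative sketch).

    cal_probs:  (n, K) softmax scores on a held-out calibration set
    cal_labels: (n,)   true labels for the calibration set
    test_probs: (m, K) softmax scores for unseen inputs
    Returns one label set per test input, with >= 1 - alpha marginal coverage.
    """
    n = len(cal_labels)
    # Nonconformity score: 1 - probability assigned to the true class.
    scores = 1.0 - cal_probs[np.arange(n), cal_labels]
    # Finite-sample-corrected quantile of the calibration scores.
    q_level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    q_hat = np.quantile(scores, q_level, method="higher")
    # A test input's set contains every class whose score is below the threshold.
    return [np.where(1.0 - p <= q_hat)[0] for p in test_probs]
```

The coverage guarantee is distribution-free but marginal: over exchangeable calibration and test data, the true label falls inside the returned set with probability at least 1 - alpha; less confident models simply yield larger sets, which is exactly the transparent uncertainty signal the abstract motivates.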
Original language: English
Title of host publication: Advances and Trends in Artificial Intelligence. Theory and Applications
Subtitle of host publication: 38th International Conference on Industrial, Engineering and Other Applications of Applied Intelligent Systems
Publisher: Springer, Cham
Pages: 272-280
Publication status: Published - 4 Jul 2025
