Abstract
Multimodal classification models, particularly those designed for fine-grained tasks, offer significant potential for various applications. However, their inability to effectively manage uncertainty often hinders their effectiveness, leading to unreliable predictions and suboptimal decision-making in real-world scenarios. We propose integrating conformal prediction into multimodal classification models to address this challenge. Conformal prediction is a robust technique for quantifying uncertainty by generating sets of plausible classifications for unseen data. These sets are accompanied by guaranteed confidence levels, providing a transparent assessment of the model’s prediction reliability. By integrating conformal prediction, our objective is to increase the reliability and trustworthiness of multimodal classification models, thereby enabling more informed decision-making in contexts where uncertainty is a significant factor.
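The abstract does not specify which conformal procedure is used; the following is a minimal sketch of one common variant, split conformal prediction, applied to classifier softmax outputs. All function and variable names are illustrative assumptions, not the paper's implementation: nonconformity is scored as one minus the probability assigned to the true class on a held-out calibration set, and a finite-sample quantile of those scores then thresholds which classes enter each prediction set.

```python
import numpy as np

def conformal_prediction_sets(cal_probs, cal_labels, test_probs, alpha=0.1):
    """Split conformal prediction sets with target coverage 1 - alpha.

    cal_probs:  (n, K) softmax outputs on a held-out calibration set
    cal_labels: (n,)   true class indices for the calibration set
    test_probs: (m, K) softmax outputs on new examples
    """
    n = len(cal_labels)
    # Nonconformity score: 1 - probability assigned to the true class.
    scores = 1.0 - cal_probs[np.arange(n), cal_labels]
    # Conformal quantile with the (n + 1)/n finite-sample correction.
    q_level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    qhat = np.quantile(scores, q_level, method="higher")
    # A class enters the set when its own score clears the threshold.
    return [np.flatnonzero(1.0 - p <= qhat) for p in test_probs]
```

Under exchangeability of calibration and test data, the returned sets contain the true label with probability at least 1 - alpha; ambiguous inputs naturally yield larger sets, making the model's uncertainty explicit.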
| Original language | English |
|---|---|
| Title of host publication | Advances and Trends in Artificial Intelligence. Theory and Applications |
| Subtitle of host publication | 38th International Conference on Industrial, Engineering and Other Applications of Applied Intelligent Systems |
| Publisher | Springer, Cham |
| Pages | 272-280 |
| Number of pages | 9 |
| Publication status | Published - 4 Jul 2025 |