TY - UNPB
T1 - Evaluating Machine Translation Quality with Conformal Predictive Distributions
AU - Giovannotti, Patrizio
PY - 2023
Y1 - 2023
N2 - This paper presents a new approach for assessing uncertainty in machine translation by simultaneously evaluating translation quality and providing a reliable confidence score. Our approach utilizes conformal predictive distributions to produce prediction intervals with guaranteed coverage, meaning that for any given significance level ε, we can expect the true quality score of a translation to fall outside the interval at a rate of at most ε (equivalently, to be covered at a rate of at least 1-ε). We demonstrate that our method outperforms a simple but effective baseline on six different language pairs in terms of coverage and sharpness. Furthermore, we validate that our approach requires the data exchangeability assumption to hold in order to perform optimally.
AB - This paper presents a new approach for assessing uncertainty in machine translation by simultaneously evaluating translation quality and providing a reliable confidence score. Our approach utilizes conformal predictive distributions to produce prediction intervals with guaranteed coverage, meaning that for any given significance level ε, we can expect the true quality score of a translation to fall outside the interval at a rate of at most ε (equivalently, to be covered at a rate of at least 1-ε). We demonstrate that our method outperforms a simple but effective baseline on six different language pairs in terms of coverage and sharpness. Furthermore, we validate that our approach requires the data exchangeability assumption to hold in order to perform optimally.
U2 - 10.48550/arXiv.2306.01549
DO - 10.48550/arXiv.2306.01549
M3 - Preprint
BT - Evaluating Machine Translation Quality with Conformal Predictive Distributions
ER -