Evaluating Machine Translation Quality with Conformal Predictive Distributions

Patrizio Giovannotti
Proceedings of the Twelfth Symposium on Conformal and Probabilistic Prediction with Applications, PMLR 204:413-429, 2023.

Abstract

This paper presents a new approach for assessing uncertainty in machine translation by simultaneously evaluating translation quality and providing a reliable confidence score. Our approach utilizes conformal predictive distributions to produce prediction intervals with guaranteed coverage, meaning that for any given significance level $\epsilon$, we can expect the true quality score of a translation to fall within the interval at a rate of $1 - \epsilon$. We demonstrate that our method outperforms a simple but effective baseline on six different language pairs in terms of coverage and sharpness. Furthermore, we validate that our approach requires the data exchangeability assumption to hold for optimal performance.
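The coverage guarantee described above can be illustrated with a minimal split-conformal sketch. This is an illustration of the general interval-construction idea under exchangeability, not the paper's conformal-predictive-distribution method; the data, the absolute-residual nonconformity score, and all names here are hypothetical.

```python
# Minimal split-conformal sketch (illustration only): build an interval that
# covers the true quality score with probability >= 1 - epsilon, assuming the
# calibration and test examples are exchangeable.
import math
import random

def conformal_interval(cal_preds, cal_truths, test_pred, epsilon):
    """Prediction interval for the true quality score of a test translation."""
    # Nonconformity scores: absolute residuals on the calibration set.
    scores = sorted(abs(p - y) for p, y in zip(cal_preds, cal_truths))
    n = len(scores)
    # Conservative empirical quantile giving >= 1 - epsilon coverage.
    k = math.ceil((n + 1) * (1 - epsilon)) - 1
    q = scores[min(k, n - 1)]
    return test_pred - q, test_pred + q

random.seed(0)
# Hypothetical calibration data: true vs. predicted quality scores in [0, 1].
truths = [random.uniform(0, 1) for _ in range(500)]
preds = [y + random.gauss(0, 0.05) for y in truths]
lo, hi = conformal_interval(preds, truths, test_pred=0.7, epsilon=0.1)
```

At significance level $\epsilon = 0.1$, intervals built this way would miss the true score on roughly 10% of exchangeable test cases; "sharpness" in the abstract refers to how narrow such intervals can be made while keeping that guarantee.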

Cite this Paper


BibTeX
@InProceedings{pmlr-v204-giovannotti23a,
  title     = {Evaluating Machine Translation Quality with Conformal Predictive Distributions},
  author    = {Giovannotti, Patrizio},
  booktitle = {Proceedings of the Twelfth Symposium on Conformal and Probabilistic Prediction with Applications},
  pages     = {413--429},
  year      = {2023},
  editor    = {Papadopoulos, Harris and Nguyen, Khuong An and Boström, Henrik and Carlsson, Lars},
  volume    = {204},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--15 Sep},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v204/giovannotti23a/giovannotti23a.pdf},
  url       = {https://proceedings.mlr.press/v204/giovannotti23a.html},
  abstract  = {This paper presents a new approach for assessing uncertainty in machine translation by simultaneously evaluating translation quality and providing a reliable confidence score. Our approach utilizes conformal predictive distributions to produce prediction intervals with guaranteed coverage, meaning that for any given significance level $\epsilon$, we can expect the true quality score of a translation to fall within the interval at a rate of $1 - \epsilon$. We demonstrate that our method outperforms a simple but effective baseline on six different language pairs in terms of coverage and sharpness. Furthermore, we validate that our approach requires the data exchangeability assumption to hold for optimal performance.}
}
Endnote
%0 Conference Paper
%T Evaluating Machine Translation Quality with Conformal Predictive Distributions
%A Patrizio Giovannotti
%B Proceedings of the Twelfth Symposium on Conformal and Probabilistic Prediction with Applications
%C Proceedings of Machine Learning Research
%D 2023
%E Harris Papadopoulos
%E Khuong An Nguyen
%E Henrik Boström
%E Lars Carlsson
%F pmlr-v204-giovannotti23a
%I PMLR
%P 413--429
%U https://proceedings.mlr.press/v204/giovannotti23a.html
%V 204
%X This paper presents a new approach for assessing uncertainty in machine translation by simultaneously evaluating translation quality and providing a reliable confidence score. Our approach utilizes conformal predictive distributions to produce prediction intervals with guaranteed coverage, meaning that for any given significance level $\epsilon$, we can expect the true quality score of a translation to fall within the interval at a rate of $1 - \epsilon$. We demonstrate that our method outperforms a simple but effective baseline on six different language pairs in terms of coverage and sharpness. Furthermore, we validate that our approach requires the data exchangeability assumption to hold for optimal performance.
APA
Giovannotti, P. (2023). Evaluating Machine Translation Quality with Conformal Predictive Distributions. Proceedings of the Twelfth Symposium on Conformal and Probabilistic Prediction with Applications, in Proceedings of Machine Learning Research 204:413-429. Available from https://proceedings.mlr.press/v204/giovannotti23a.html.