Machine Translation Evaluation Online

Start date
Aug. 1, 2022
End date
July 31, 2023

Machine translation is all around us, and to push its quality forward, a variety of automatic metrics exist that compare the quality of machine translations with reference translations. Yet an accessible, unified approach for using these evaluation metrics is missing. Our proposal, MATEO (Machine Translation Evaluation Online), bridges this gap. We cater to both technical and non-technical users by open-sourcing an advanced Python tool as well as a user-friendly web interface with ample visualisations and options. The project seeks to support researchers in the Social Sciences and Humanities (SSH) and beyond in their MT-related research endeavours. Additionally, it will improve the digital literacy of non-expert users by allowing them to easily evaluate machine-generated translations. The interface will be incorporated in the CLARIN B Center of INT-NL.
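To illustrate the kind of reference-based evaluation such metrics perform, here is a deliberately simplified, self-contained sketch of a BLEU-style score (clipped n-gram precision with a brevity penalty). This is an illustration only, not the MATEO implementation; production toolkits add tokenization rules, smoothing, and corpus-level aggregation.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Multiset of n-grams in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def simple_bleu(hypothesis, reference, max_n=4):
    """Simplified sentence-level BLEU: geometric mean of clipped n-gram
    precisions (n = 1..max_n) times a brevity penalty. Hypothetical helper
    for illustration; real metrics libraries are more sophisticated."""
    hyp, ref = hypothesis.split(), reference.split()
    precisions = []
    for n in range(1, max_n + 1):
        hyp_ngrams, ref_ngrams = ngrams(hyp, n), ngrams(ref, n)
        overlap = sum((hyp_ngrams & ref_ngrams).values())  # clipped matches
        total = max(sum(hyp_ngrams.values()), 1)
        precisions.append(max(overlap, 1e-9) / total)  # floor avoids log(0)
    geo_mean = math.exp(sum(math.log(p) for p in precisions) / max_n)
    # Penalize hypotheses that are shorter than the reference.
    brevity_penalty = min(1.0, math.exp(1 - len(ref) / max(len(hyp), 1)))
    return brevity_penalty * geo_mean

# A perfect match scores 1.0; partial overlap scores between 0 and 1.
print(simple_bleu("the cat sat on the mat", "the cat sat on the mat"))
```

Such a score can then be averaged or computed at corpus level and compared across MT systems, which is the workflow the MATEO interface exposes without requiring any programming.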