Machine Translation on ACES
Metrics: Score

The Score column reports the ACES-Score from the paper cited below: a weighted combination of Kendall's tau-like correlations computed over the benchmark's accuracy-error phenomenon categories. Higher is better; a negative value means that, on balance, the metric prefers the incorrect translation over the correct one.
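Each ACES example pairs a source and reference with a "good" translation and an "incorrect" translation containing a targeted accuracy error. A metric is credited when it scores the good translation above the incorrect one, which yields a Kendall tau-like statistic per phenomenon. A minimal sketch in Python, assuming score lists already produced by the metric under test (all names here are illustrative, and tie handling is an assumption; see the paper for the exact rule):

    from typing import Sequence

    def tau_like(good_scores: Sequence[float], bad_scores: Sequence[float]) -> float:
        """Kendall tau-like correlation over challenge-set pairs.

        concordant: the metric scored the good translation higher.
        discordant: the metric scored the incorrect translation higher,
        or tied (counting ties as discordant is an assumption here).
        """
        concordant = sum(g > b for g, b in zip(good_scores, bad_scores))
        discordant = len(good_scores) - concordant
        return (concordant - discordant) / (concordant + discordant)

    # Toy usage: 3 of 4 pairs ranked correctly -> tau = (3 - 1) / 4 = 0.5
    print(tau_like([0.9, 0.8, 0.7, 0.4], [0.5, 0.6, 0.3, 0.6]))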
Results: Performance results of various models on this benchmark.
All entries below are reported in the paper "ACES: Translation Accuracy Challenge Sets for Evaluating Machine Translation Metrics"; no repository is linked for any entry. Rows are sorted by Score, highest first.

Model Name             Score
HWTSC-Teacher-Sim      19.97
MS-COMET-QE-22         19.76
KG-BERTScore           17.28
metricx_xl_DA_2019     17.17
COMET-QE               16.8
COMET-22               16.31
UniTE-src              15.68
UniTE-ref              15.38
metricx_xxl_DA_2019    15.24
UniTE                  14.76
Cross-QE               14.07
chrF                   13.57
metricx_xl_MQM_2020    13.08
COMET-20               12.06
BLEURT-20              11.9
YiSi-1                 11.38
BERTScore              10.47
f200spBLEU             -0.18
f101spBLEU             -0.33
BLEU                   -3.13
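The low or negative scores for surface-overlap metrics (BLEU, spBLEU, chrF) are consistent with the benchmark's design: an accuracy error can leave most of the surface form intact. A minimal sketch using the sacrebleu package (the sentences are hypothetical examples, not ACES data) shows how little such metrics may separate a good translation from a meaning-reversing one:

    import sacrebleu  # pip install sacrebleu

    ref = ["The dog chases the cat."]        # reference translation
    good = "The dog is chasing the cat."     # adequate translation
    bad = "The cat is chasing the dog."      # accuracy error: agents swapped

    for label, hyp in [("good", good), ("incorrect", bad)]:
        bleu = sacrebleu.sentence_bleu(hyp, ref)
        chrf = sacrebleu.sentence_chrf(hyp, ref)
        print(f"{label:9s} BLEU={bleu.score:5.1f}  chrF={chrf.score:5.1f}")

    # Both hypotheses reuse the same words, so the surface scores come out
    # close even though only one preserves the meaning -- the failure mode
    # that ACES probes.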