KG-to-Text Generation on WebNLG Unseen
Metrics
BLEU
METEOR
chrF++
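The metrics above compare generated text against reference text via n-gram overlap. As a minimal illustration of the idea behind BLEU (the leaderboard numbers themselves come from the papers' own evaluation toolkits, not this sketch), the following pure-Python function computes corpus-level BLEU: modified n-gram precisions up to order 4, combined by a geometric mean and scaled by a brevity penalty.

```python
from collections import Counter
import math

def ngrams(tokens, n):
    """Count the n-grams of a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def corpus_bleu(hypotheses, references, max_n=4):
    """Corpus-level BLEU for one reference per hypothesis (minimal sketch)."""
    match = [0] * max_n   # clipped n-gram matches per order
    total = [0] * max_n   # hypothesis n-gram counts per order
    hyp_len = ref_len = 0
    for hyp, ref in zip(hypotheses, references):
        h, r = hyp.split(), ref.split()
        hyp_len += len(h)
        ref_len += len(r)
        for n in range(1, max_n + 1):
            h_ngrams, r_ngrams = ngrams(h, n), ngrams(r, n)
            # Clip each hypothesis n-gram count by its count in the reference.
            match[n - 1] += sum(min(c, r_ngrams[g]) for g, c in h_ngrams.items())
            total[n - 1] += max(len(h) - n + 1, 0)
    if min(match) == 0:
        return 0.0  # any zero precision makes the geometric mean zero
    log_prec = sum(math.log(m / t) for m, t in zip(match, total)) / max_n
    # Brevity penalty: penalize hypotheses shorter than the references.
    bp = 1.0 if hyp_len > ref_len else math.exp(1 - ref_len / hyp_len)
    return bp * math.exp(log_prec)
```

METEOR additionally aligns stems and synonyms, and chrF++ scores character n-gram F-measure, so their implementations are more involved; in practice all three are computed with standard toolkits (e.g. sacrebleu for BLEU and chrF++).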
Results

Performance results of various models on this benchmark:
| Model | BLEU | METEOR | chrF++ | Paper Title | Repository |
|---|---|---|---|---|---|
| T5_large | 53.67 | 42.26 | 72.25 | Investigating Pretrained Language Models for Graph-to-Text Generation | |
| BART_large | 43.97 | 38.61 | 66.53 | Investigating Pretrained Language Models for Graph-to-Text Generation | |