KG-to-Text Generation on WebNLG (Seen)
Metrics

- BLEU
- METEOR
- chrF++
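The scores below are n-gram (or character n-gram) overlap metrics between generated text and references. As a concrete illustration, here is a minimal, simplified sketch of corpus-level BLEU (modified n-gram precision with clipping plus a brevity penalty, single reference per hypothesis); it is not the exact scorer used for this benchmark, which in practice is typically computed with a standard tool such as sacrebleu:

```python
import math
from collections import Counter

def ngram_counts(tokens, n):
    """Count the n-grams of a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def corpus_bleu(hypotheses, references, max_n=4):
    """Simplified corpus BLEU (0-100), one reference per hypothesis."""
    matches = [0] * max_n   # clipped n-gram matches, per order
    totals = [0] * max_n    # candidate n-gram counts, per order
    hyp_len = ref_len = 0
    for hyp, ref in zip(hypotheses, references):
        h, r = hyp.split(), ref.split()
        hyp_len += len(h)
        ref_len += len(r)
        for n in range(1, max_n + 1):
            hc, rc = ngram_counts(h, n), ngram_counts(r, n)
            # Clip each candidate n-gram count by its count in the reference.
            matches[n - 1] += sum(min(c, rc[g]) for g, c in hc.items())
            totals[n - 1] += max(len(h) - n + 1, 0)
    if min(matches) == 0:
        return 0.0  # any zero precision drives the geometric mean to 0
    log_prec = sum(math.log(m / t) for m, t in zip(matches, totals)) / max_n
    # Brevity penalty: penalize hypotheses shorter than the references.
    bp = 1.0 if hyp_len > ref_len else math.exp(1 - ref_len / hyp_len)
    return 100 * bp * math.exp(log_prec)
```

A perfect match scores 100; chrF++ differs mainly in operating over character n-grams (augmented with word n-grams) and using an F-score rather than precision alone, while METEOR additionally aligns stems and synonyms.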
Results
Performance of models on this benchmark:

| Model | BLEU | METEOR | chrF++ | Paper Title | Repository |
|---|---|---|---|---|---|
| BART_large | 63.45 | 45.49 | 77.57 | Investigating Pretrained Language Models for Graph-to-Text Generation | |
| T5_large | 64.71 | 45.85 | 78.29 | Investigating Pretrained Language Models for Graph-to-Text Generation | |