Table-to-Text Generation on WebNLG (Seen)
Metrics
BLEU (higher is better)
METEOR (higher is better)
TER (Translation Edit Rate; lower is better)
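The scores in the table below can be reproduced for a system's outputs with standard MT evaluation tooling. A minimal sketch, assuming sacrebleu for BLEU/TER and a recent NLTK for METEOR; the example sentences are illustrative placeholders, not benchmark data:

```python
# Minimal sketch of scoring system outputs with the three metrics above.
# Assumes sacrebleu (BLEU, TER) and a recent NLTK (METEOR, pre-tokenized input);
# the hypotheses/references here are illustrative, not WebNLG data.
import sacrebleu
import nltk
from nltk.translate.meteor_score import meteor_score

nltk.download("wordnet", quiet=True)  # METEOR uses WordNet for synonym matching

hypotheses = ["Alan Bean was born in Wheeler, Texas."]
references = ["Alan Bean was born in Wheeler, Texas."]

# BLEU and TER are corpus-level; sacrebleu expects a list of reference streams.
bleu = sacrebleu.corpus_bleu(hypotheses, [references])
ter = sacrebleu.corpus_ter(hypotheses, [references])

# NLTK's METEOR is sentence-level on tokenized text; average over the corpus.
meteor = sum(
    meteor_score([ref.split()], hyp.split())
    for hyp, ref in zip(hypotheses, references)
) / len(hypotheses)

print(f"BLEU:   {bleu.score:.1f}")  # 0-100 scale, as in the table
print(f"METEOR: {meteor:.2f}")      # 0-1 scale
print(f"TER:    {ter.score:.1f}")   # sacrebleu reports TER on a 0-100 scale;
                                    # the table below uses the 0-1 convention
```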
Results
Reported results of models evaluated on this benchmark are listed below.
Comparison Table
| Model | BLEU | METEOR | TER |
|---|---|---|---|
| HTLM (Hyper-Text Pre-Training and Prompting of Language Models) | 65.4 | 0.46 | 0.33 |
| HTLM (Hyper-Text Pre-Training and Prompting of Language Models) | 65.3 | 0.46 | 0.33 |