Code Documentation Generation On 6
Metrics
Smoothed BLEU-4
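The scores in the table below are smoothed BLEU-4 values. As a rough illustration of how such a score can be computed, here is a minimal sketch using NLTK's `sentence_bleu` with a `SmoothingFunction`; the exact smoothing variant used by the benchmark's official evaluation script may differ, and the helper name `smoothed_bleu4` is hypothetical.

```python
# Minimal sketch: smoothed sentence-level BLEU-4 with NLTK.
# NLTK's SmoothingFunction is used here as an approximation; the benchmark's
# official script may apply a different smoothing variant.
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

def smoothed_bleu4(reference: str, hypothesis: str) -> float:
    """Smoothed BLEU-4 between one reference and one hypothesis (whitespace-tokenized)."""
    ref_tokens = reference.split()
    hyp_tokens = hypothesis.split()
    smoother = SmoothingFunction().method4  # smooths zero counts for higher-order n-grams
    return sentence_bleu(
        [ref_tokens],                       # list of reference token lists
        hyp_tokens,
        weights=(0.25, 0.25, 0.25, 0.25),   # uniform weights over 1- to 4-grams
        smoothing_function=smoother,
    )

if __name__ == "__main__":
    ref = "returns the absolute value of the given number"
    hyp = "return absolute value of a number"
    print(f"smoothed BLEU-4: {smoothed_bleu4(ref, hyp):.4f}")
```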
Results
Performance of various models on this benchmark
| Model | Smoothed BLEU-4 | Paper Title |
|---|---|---|
| CodeBERT (MLM+RTD) | 15.99 | CodeBERT: A Pre-Trained Model for Programming and Natural Languages |
| CodeBERT (MLM) | 15.55 | CodeBERT: A Pre-Trained Model for Programming and Natural Languages |
| pre-train w/ code only | 15.15 | CodeBERT: A Pre-Trained Model for Programming and Natural Languages |
| CodeBERT (RTD) | 15.03 | CodeBERT: A Pre-Trained Model for Programming and Natural Languages |
| RoBERTa | 14.52 | CodeBERT: A Pre-Trained Model for Programming and Natural Languages |
| Transformer | 14.31 | CodeBERT: A Pre-Trained Model for Programming and Natural Languages |
| seq2seq | 13.36 | CodeBERT: A Pre-Trained Model for Programming and Natural Languages |