# Code Documentation Generation
## Metrics

Smoothed BLEU-4
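The metric above can be illustrated with a minimal sketch. The benchmark's official evaluation script may differ in detail; this version applies add-one smoothing to the higher-order n-gram counts (in the spirit of Lin & Och, 2004) so that short generated docstrings with no 4-gram overlap still receive a non-zero score. Token inputs and the example strings are illustrative.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Multiset of n-grams in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def smoothed_bleu4(reference, candidate):
    """Sentence-level BLEU-4 with +1 smoothing on n-gram precisions
    for n >= 2 (an approximation of the benchmark's metric)."""
    log_precisions = []
    for n in range(1, 5):
        cand = ngrams(candidate, n)
        ref = ngrams(reference, n)
        overlap = sum((cand & ref).values())   # clipped n-gram matches
        total = sum(cand.values())
        if n == 1:
            if total == 0 or overlap == 0:
                return 0.0                      # no unigram overlap at all
            p = overlap / total
        else:
            p = (overlap + 1) / (total + 1)     # add-one smoothing
        log_precisions.append(math.log(p))
    # Brevity penalty for candidates shorter than the reference.
    if len(candidate) >= len(reference):
        bp = 1.0
    else:
        bp = math.exp(1 - len(reference) / max(len(candidate), 1))
    return bp * math.exp(sum(log_precisions) / 4)

ref = "returns the sum of two integers".split()
cand = "return the sum of two numbers".split()
print(f"Smoothed BLEU-4: {smoothed_bleu4(ref, cand):.4f}")
```

Scores in the table below are reported on a 0-100 scale, i.e. this sentence-level value multiplied by 100 and averaged over the test set.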
## Results

Performance of the listed models on this benchmark, scored by smoothed BLEU-4.
| Model | Smoothed BLEU-4 | Paper Title |
|---|---|---|
| CodeTrans-MT-Base | 20.39 | CodeTrans: Towards Cracking the Language of Silicon's Code Through Self-Supervised Deep Learning and High Performance Computing |
| CodeBERT (MLM) | 15.48 | CodeBERT: A Pre-Trained Model for Programming and Natural Languages |
| CodeBERT (MLM+RTD) | 15.41 | CodeBERT: A Pre-Trained Model for Programming and Natural Languages |
| pre-train w/ code only | 15.12 | CodeBERT: A Pre-Trained Model for Programming and Natural Languages |
| RoBERTa | 14.92 | CodeBERT: A Pre-Trained Model for Programming and Natural Languages |
| Transformer | 13.44 | CodeBERT: A Pre-Trained Model for Programming and Natural Languages |
| seq2seq | 13.04 | CodeBERT: A Pre-Trained Model for Programming and Natural Languages |