Code Documentation Generation On 3
Metrics
Smoothed BLEU-4
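Smoothed BLEU-4 is sentence-level BLEU over 1- to 4-grams with smoothing applied to the n-gram precisions, so that short generated docstrings with no 4-gram overlap do not collapse to a zero score. As an illustration only (the exact smoothing variant differs between papers), here is a minimal sketch using the common +1 smoothing of Lin & Och; the function name and tokenized inputs are assumptions for the example:

```python
from collections import Counter
import math

def ngrams(tokens, n):
    """Multiset of n-grams in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def smoothed_bleu4(candidate, reference):
    """Sentence-level BLEU-4 with +1 smoothing on the n-gram precisions
    (one common variant; papers may use a different smoothing method)."""
    precisions = []
    for n in range(1, 5):
        cand = ngrams(candidate, n)
        ref = ngrams(reference, n)
        overlap = sum(min(count, ref[g]) for g, count in cand.items())
        total = sum(cand.values())
        # +1 smoothing keeps the precision nonzero for short sentences
        precisions.append((overlap + 1) / (total + 1))
    # Brevity penalty discourages overly short candidates
    c, r = len(candidate), len(reference)
    bp = 1.0 if c > r else math.exp(1 - r / max(c, 1))
    # Geometric mean of the four smoothed precisions
    return bp * math.exp(sum(math.log(p) for p in precisions) / 4)
```

A perfect match scores 1.0, and partial overlap falls between 0 and 1; leaderboard numbers like those below are typically this score multiplied by 100 and averaged over the test set.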
Results
Performance results of various models on this benchmark
| Model Name | Smoothed BLEU-4 | Paper Title | Repository |
|---|---|---|---|
| CodeTrans-MT-Base | 26.23 | CodeTrans: Towards Cracking the Language of Silicon's Code Through Self-Supervised Deep Learning and High Performance Computing | |
| CodeBERT (MLM+RTD) | 21.32 | CodeBERT: A Pre-Trained Model for Programming and Natural Languages | |
| CodeBERT (MLM) | 21.00 | CodeBERT: A Pre-Trained Model for Programming and Natural Languages | |
| pre-train w/ code only | 20.71 | CodeBERT: A Pre-Trained Model for Programming and Natural Languages | |
| CodeBERT (RTD) | 20.25 | CodeBERT: A Pre-Trained Model for Programming and Natural Languages | |
| RoBERTa | 19.90 | CodeBERT: A Pre-Trained Model for Programming and Natural Languages | |
| seq2seq | 18.40 | CodeBERT: A Pre-Trained Model for Programming and Natural Languages | |
| Transformer | 18.25 | CodeBERT: A Pre-Trained Model for Programming and Natural Languages | |