Code Documentation Generation On 1
Evaluation Metric
Smoothed BLEU-4
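Below is a minimal sketch of how a smoothed sentence-level BLEU-4 score can be computed with NLTK. Note that this is an illustration only: the papers listed here typically use their own smoothed BLEU scripts, and NLTK's `method4` smoothing is assumed as a common approximation rather than the exact evaluation code.

```python
# Minimal sketch of smoothed BLEU-4 using NLTK (assumed approximation of the
# metric; the referenced papers may use their own smoothing implementation).
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction


def smoothed_bleu4(reference: str, hypothesis: str) -> float:
    """Sentence-level BLEU-4 with smoothing on whitespace-tokenized text."""
    ref_tokens = reference.split()
    hyp_tokens = hypothesis.split()
    return sentence_bleu(
        [ref_tokens],                      # list of reference token lists
        hyp_tokens,
        weights=(0.25, 0.25, 0.25, 0.25),  # uniform 1- to 4-gram weights
        smoothing_function=SmoothingFunction().method4,
    )


if __name__ == "__main__":
    ref = "returns the absolute value of the given number"
    hyp = "return absolute value of a number"
    print(f"Smoothed BLEU-4: {smoothed_bleu4(ref, hyp):.4f}")
```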
Evaluation Results
Performance of each model on this benchmark:
Model | Smoothed BLEU-4 | Paper Title | Repository |
---|---|---|---|
CodeTrans-MT-Large | 21.87 | CodeTrans: Towards Cracking the Language of Silicon's Code Through Self-Supervised Deep Learning and High Performance Computing | |
CodeBERT (MLM+RTD) | 14.56 | CodeBERT: A Pre-Trained Model for Programming and Natural Languages | |
CodeBERT (MLM) | 13.59 | CodeBERT: A Pre-Trained Model for Programming and Natural Languages | |
RoBERTa | 13.2 | CodeBERT: A Pre-Trained Model for Programming and Natural Languages | |
pre-train w/ code only | 13.07 | CodeBERT: A Pre-Trained Model for Programming and Natural Languages | |
CodeBERT (RTD) | 12.72 | CodeBERT: A Pre-Trained Model for Programming and Natural Languages | |
Transformer | 12.57 | CodeBERT: A Pre-Trained Model for Programming and Natural Languages | |
seq2seq | 11.42 | CodeBERT: A Pre-Trained Model for Programming and Natural Languages | |