Code Documentation Generation On 4
Evaluation Metric
Smoothed BLEU-4
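Smoothed BLEU-4 is sentence-level BLEU with up-to-4-gram precision, where a smoothing term keeps the score from collapsing to zero when a higher-order n-gram has no match (common for short documentation strings). A minimal pure-Python sketch of one common variant (add-one smoothing on n-gram counts, in the spirit of Lin & Och 2004) is shown below; it is illustrative only and is not the official evaluation script used by these papers:

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def smoothed_bleu4(candidate: str, reference: str) -> float:
    """Sentence-level BLEU-4 with add-one smoothing (illustrative sketch)."""
    cand = candidate.split()
    ref = reference.split()
    log_prec = 0.0
    for n in range(1, 5):
        cand_counts = Counter(ngrams(cand, n))
        ref_counts = Counter(ngrams(ref, n))
        # clipped n-gram overlap between candidate and reference
        overlap = sum(min(c, ref_counts[g]) for g, c in cand_counts.items())
        total = sum(cand_counts.values())
        # add-one smoothing avoids log(0) when no n-gram matches
        p_n = (overlap + 1) / (total + 1)
        log_prec += math.log(p_n)
    # brevity penalty: punish candidates shorter than the reference
    bp = 1.0 if len(cand) > len(ref) else math.exp(1 - len(ref) / max(len(cand), 1))
    return bp * math.exp(log_prec / 4)
```

An identical candidate and reference score 1.0; partial overlap yields a value strictly between 0 and 1. Published leaderboard numbers typically report this score scaled to 0–100.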
Evaluation Results
Performance of each model on this benchmark:
Model | Smoothed BLEU-4 | Paper | Repository
---|---|---|---
CodeTrans-MT-Base | 15.26 | CodeTrans: Towards Cracking the Language of Silicon's Code Through Self-Supervised Deep Learning and High Performance Computing | |
CodeBERT (MLM+RTD) | 8.46 | CodeBERT: A Pre-Trained Model for Programming and Natural Languages | |
CodeBERT (MLM) | 7.95 | CodeBERT: A Pre-Trained Model for Programming and Natural Languages | |
Transformer | 7.87 | CodeBERT: A Pre-Trained Model for Programming and Natural Languages | |
pre-train w/ code only | 7.36 | CodeBERT: A Pre-Trained Model for Programming and Natural Languages | |
RoBERTa | 7.26 | CodeBERT: A Pre-Trained Model for Programming and Natural Languages | |
seq2seq | 6.96 | CodeBERT: A Pre-Trained Model for Programming and Natural Languages | |