Source Code Summarization on Parallelcorpus
Metrics
- BLEU-4
- METEOR
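Both metrics score the overlap between a generated summary and its reference description: BLEU-4 via modified n-gram precision up to 4-grams, METEOR via unigram alignment with stemming and synonym matching. The benchmark's exact evaluation scripts are not given here; the following is a minimal sketch, assuming NLTK's implementations and whitespace-tokenized summaries, of how the two scores might be computed.

```python
# Minimal sketch (not the benchmark's official evaluation code) of computing
# BLEU-4 and METEOR with NLTK. The example sentences are hypothetical; the
# real benchmark scores models on the Parallelcorpus test split.
from nltk.translate.bleu_score import corpus_bleu, SmoothingFunction
from nltk.translate.meteor_score import meteor_score
# Requires: nltk.download('wordnet') once, for METEOR's synonym matching.

# Hypothetical tokenized reference descriptions and model-generated summaries.
references = [["returns", "the", "maximum", "of", "two", "integers"]]
candidates = [["return", "the", "maximum", "of", "two", "numbers"]]

# BLEU-4: uniform weights over 1- to 4-grams; smoothing keeps short summaries
# with no 4-gram matches from collapsing to zero (a common assumption, the
# benchmark's smoothing choice is not stated).
bleu4 = corpus_bleu(
    [[ref] for ref in references],  # each candidate may have several references
    candidates,
    weights=(0.25, 0.25, 0.25, 0.25),
    smoothing_function=SmoothingFunction().method4,
)

# METEOR: computed per sentence and averaged over the corpus.
meteor = sum(
    meteor_score([ref], cand) for ref, cand in zip(references, candidates)
) / len(candidates)

print(f"BLEU-4: {bleu4:.4f}  METEOR: {meteor:.4f}")
```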
Results
Performance results of various models on this benchmark
| Model Name | BLEU-4 | METEOR | Paper Title | Repository |
|---|---|---|---|---|
| AdaMo-basic | 33.85 | 21.68% | Assemble Foundation Models for Automatic Code Summarization | |
| AdaMo-noise | 34.05 | 21.92% | Assemble Foundation Models for Automatic Code Summarization | |