Dataset from Papers With Code
7 results indexed for one metric, smoothed BLEU-4. The top row is the current SOTA; ties are broken by submission date.
| # | Model | Org | Submitted | Paper / code | Smoothed BLEU-4 |
|---|---|---|---|---|---|
| 01 | CodeTrans-MT-Base | — | Apr 2021 | CodeTrans: Towards Cracking the Language of Silicon's Co… · code | 20.39 |
| 02 | CodeBERT (MLM) | — | Feb 2020 | CodeBERT: A Pre-Trained Model for Programming and Natura… · code | 15.48 |
| 03 | CodeBERT (MLM+RTD) | — | Feb 2020 | CodeBERT: A Pre-Trained Model for Programming and Natura… · code | 15.41 |
| 04 | pre-train w/ code only | — | Feb 2020 | CodeBERT: A Pre-Trained Model for Programming and Natura… · code | 15.12 |
| 05 | RoBERTa | — | Feb 2020 | CodeBERT: A Pre-Trained Model for Programming and Natura… · code | 14.92 |
| 06 | Transformer | — | Feb 2020 | CodeBERT: A Pre-Trained Model for Programming and Natura… · code | 13.44 |
| 07 | seq2seq | — | Feb 2020 | CodeBERT: A Pre-Trained Model for Programming and Natura… · code | 13.04 |
Every row in the leaderboard links to its paper. Click through for the arXiv preprint and, when available, the reference implementation.
Submit a checkpoint and a reproduction script. We will run it, publish the score, and, if it takes the top spot, annotate the step on the progress chart with your name.
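For reference, the metric column reports sentence-level BLEU-4 with smoothing, which avoids the zero scores plain BLEU assigns to short outputs with no 4-gram overlap. The sketch below is an illustrative implementation using add-one smoothing on the n-gram precisions (in the spirit of Lin and Och's ORANGE smoothing); the leaderboard's exact smoothing variant and tokenization may differ, so treat this as an approximation, not the official scorer.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list, as tuples."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def smoothed_bleu4(reference: str, candidate: str) -> float:
    """Sentence-level BLEU-4 with +1 smoothing on clipped n-gram counts.

    Tokenization is plain whitespace splitting (an assumption; real
    evaluations typically apply their own tokenizer first).
    """
    ref, cand = reference.split(), candidate.split()
    if not cand:
        return 0.0
    log_prec_sum = 0.0
    for n in range(1, 5):
        cand_counts = Counter(ngrams(cand, n))
        ref_counts = Counter(ngrams(ref, n))
        # Clip each candidate n-gram count by its count in the reference.
        clipped = sum(min(c, ref_counts[g]) for g, c in cand_counts.items())
        total = sum(cand_counts.values())
        # Add-one smoothing keeps the precision nonzero for every n.
        log_prec_sum += math.log((clipped + 1) / (total + 1))
    # Brevity penalty: penalize candidates shorter than the reference.
    bp = 1.0 if len(cand) >= len(ref) else math.exp(1 - len(ref) / len(cand))
    return bp * math.exp(log_prec_sum / 4)
```

An identical candidate and reference score 1.0; disjoint strings score close to, but above, 0 because of the smoothing.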