MTOB (Machine Translation from One Book) is a benchmark for learning to translate between English and Kalamang, an extremely low-resource language with fewer than 200 speakers, from a single field-linguistics grammar book and related reference materials (word lists, example sentence pairs, and grammar excerpts). It was introduced by Tanzer et al. in "A Benchmark for Learning to Translate a New Language from One Grammar Book" (arXiv:2309.16575, ICLR 2024). The benchmark frames translation as learning a new language from human-readable grammar materials rather than from large mined corpora, and evaluates model performance on English↔Kalamang translation. The original paper reports automatic metrics (e.g., chrF) for the kgv→eng direction, and the dataset has been evaluated under different context settings (no-context, half-book, full-book); follow-up evaluations have reported BLEURT for the Kalamang→English direction under these settings. Code and data are available from the authors' repository (https://github.com/lukemelas/mtob), and there is a Hugging Face dataset entry.
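As an illustration of how a Kalamang→English submission might be scored, here is a minimal chrF sketch using the sacrebleu library. The file names and JSON layout are placeholders, not the benchmark's official interface; use the evaluation harness in the authors' repository for reported numbers.

```python
# Minimal chrF scoring sketch for Kalamang→English predictions.
# File names and JSON layout are hypothetical; the official harness lives in
# https://github.com/lukemelas/mtob.
import json

from sacrebleu.metrics import CHRF


def score_predictions(pred_path: str, ref_path: str) -> float:
    """Compute corpus-level chrF for a list of hypotheses against single references."""
    with open(pred_path, encoding="utf-8") as f:
        hypotheses = json.load(f)   # e.g. ["The man went to the garden.", ...]
    with open(ref_path, encoding="utf-8") as f:
        references = json.load(f)   # parallel list of gold English sentences

    chrf = CHRF()  # sacrebleu's default chrF configuration
    # corpus_score takes a list of hypothesis strings and a list of
    # reference streams (one stream per reference set).
    return chrf.corpus_score(hypotheses, [references]).score


if __name__ == "__main__":
    print(f"chrF: {score_predictions('predictions.json', 'references.json'):.2f}")
```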
No results indexed yet — be the first to submit a score.
Submit a checkpoint and a reproduction script. We will run it, publish the score, and, if it takes the top spot, annotate the step on the progress chart with your name. A sketch of what a reproduction script might look like follows.
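The sketch below is one hypothetical shape for such a script: it reads the test sentences, builds a prompt per example under a chosen context setting (empty context for no-context, a grammar-book excerpt for half-book or full-book), and writes predictions for scoring. All paths, the JSON layout, the prompt wording, and the `generate` callable are assumptions for illustration, not the benchmark's official interface.

```python
# Hypothetical reproduction-script skeleton for Kalamang→English.
# Paths, JSON layout, prompt wording, and the generate() callable are
# placeholders; adapt them to your checkpoint and to the official data format.
import json
from typing import Callable


def build_prompt(source: str, context: str) -> str:
    """Prepend the grammar-book context (empty string for the no-context setting)."""
    header = f"{context}\n\n" if context else ""
    return f"{header}Translate the following Kalamang sentence into English:\n{source}\nEnglish:"


def run(test_path: str, context_path: str, out_path: str, generate: Callable[[str], str]) -> None:
    with open(test_path, encoding="utf-8") as f:
        test_set = json.load(f)           # e.g. [{"source": "..."}, ...]
    context = ""
    if context_path:                      # half-book / full-book settings
        with open(context_path, encoding="utf-8") as f:
            context = f.read()
    predictions = [generate(build_prompt(ex["source"], context)) for ex in test_set]
    with open(out_path, "w", encoding="utf-8") as f:
        json.dump(predictions, f, ensure_ascii=False, indent=2)
```

Pass in whatever `generate` wraps your checkpoint (an API call, a local `transformers` pipeline, etc.); the script only needs a string-in, string-out translation function.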