The RDCL2019 test set comes from the ICDAR 2019 Competition on Recognition of Documents with Complex Layouts. It comprises 85 scanned page images from contemporary magazines and technical/scientific publications (drawn from the PRImA Layout Analysis Dataset). Evaluation measures region segmentation and classification using a weighted F1-score across layout classes. The competition runs continuously, accepting post-2019 submissions via the Aletheia evaluation tool.
This is a competition benchmark from 2019. The top overall score (F1 ~0.92) is held jointly by fglihai and USYD NLP_CS29-2. The research community has since moved to DocLayNet, D4LA, and DocStructBench for newer benchmarking; no new papers report results on this specific test set in 2024-2025.
18 results indexed across 6 metrics. The shaded row marks the current SOTA; ties are broken by submission date.
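The per-metric tables below report a class-weighted F1. As an illustrative sketch only (not the official Aletheia scoring code), a weighted F1 aggregates per-class F1 scores, weighting each class by its share of regions; the class names and counts below are hypothetical:

```python
# Sketch of a class-weighted F1: average per-class F1 scores,
# weighting each class by its region count. Illustrative only;
# the official RDCL2019 scoring is done by the Aletheia tool.
def weighted_f1(per_class_f1, class_counts):
    """Both arguments are dicts keyed by layout-class name."""
    total = sum(class_counts.values())
    return sum(per_class_f1[c] * class_counts[c] / total for c in per_class_f1)

# Hypothetical per-class scores and region counts, for illustration only.
f1 = {"text": 0.93, "title": 0.84, "table": 0.96, "figure": 0.97, "list": 0.90}
counts = {"text": 50, "title": 10, "table": 5, "figure": 8, "list": 4}
print(round(weighted_f1(f1, counts), 3))  # → 0.923
```

Because frequent classes such as "text" dominate the weights, a model can post a high overall score while lagging on rare classes like "title", which is why the per-class tables below are reported separately.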
| # | Model | Org | Submitted | Paper / code | F1 (figure) |
|---|---|---|---|---|---|
| 01 | fglihai | — | Sep 2019 | ICDAR 2019 RDCL2019 Competition | 0.970 |
| 02 | USYD NLP_CS29-2 | — | Sep 2019 | ICDAR 2019 RDCL2019 Competition | 0.960 |
| 03 | Faster R-CNNOSS | Microsoft Research | Sep 2019 | ICDAR 2019 RDCL2019 Competition | 0.950 |
| # | Model | Org | Submitted | Paper / code | F1 (list) |
|---|---|---|---|---|---|
| 01 | fglihai | — | Sep 2019 | ICDAR 2019 RDCL2019 Competition | 0.900 |
| 02 | USYD NLP_CS29-2 | — | Sep 2019 | ICDAR 2019 RDCL2019 Competition | 0.900 |
| 03 | Faster R-CNNOSS | Microsoft Research | Sep 2019 | ICDAR 2019 RDCL2019 Competition | 0.890 |
| # | Model | Org | Submitted | Paper / code | F1 (overall) |
|---|---|---|---|---|---|
| 01 | USYD NLP_CS29-2 | — | Sep 2019 | ICDAR 2019 RDCL2019 Competition | 0.920 |
| 02 | fglihai | — | Sep 2019 | ICDAR 2019 RDCL2019 Competition | 0.920 |
| 03 | Faster R-CNNOSS | Microsoft Research | Sep 2019 | ICDAR 2019 RDCL2019 Competition | 0.910 |
| # | Model | Org | Submitted | Paper / code | F1 (table) |
|---|---|---|---|---|---|
| 01 | USYD NLP_CS29-2 | — | Sep 2019 | ICDAR 2019 RDCL2019 Competition | 0.960 |
| 02 | fglihai | — | Sep 2019 | ICDAR 2019 RDCL2019 Competition | 0.960 |
| 03 | Faster R-CNNOSS | Microsoft Research | Sep 2019 | ICDAR 2019 RDCL2019 Competition | 0.950 |
| # | Model | Org | Submitted | Paper / code | F1 (text) |
|---|---|---|---|---|---|
| 01 | fglihai | — | Sep 2019 | ICDAR 2019 RDCL2019 Competition | 0.930 |
| 02 | USYD NLP_CS29-2 | — | Sep 2019 | ICDAR 2019 RDCL2019 Competition | 0.930 |
| 03 | Faster R-CNNOSS | Microsoft Research | Sep 2019 | ICDAR 2019 RDCL2019 Competition | 0.920 |
| # | Model | Org | Submitted | Paper / code | F1 (title) |
|---|---|---|---|---|---|
| 01 | USYD NLP_CS29-2 | — | Sep 2019 | ICDAR 2019 RDCL2019 Competition | 0.840 |
| 02 | fglihai | — | Sep 2019 | ICDAR 2019 RDCL2019 Competition | 0.840 |
| 03 | Faster R-CNNOSS | Microsoft Research | Sep 2019 | ICDAR 2019 RDCL2019 Competition | 0.820 |
Submit a checkpoint and a reproduction script. We will run it, publish the score, and, if it takes the top spot, annotate the step on the progress chart with your name.