Model card
LightGBM
Microsoft · open-source · Gradient Boosted Trees (leaf-wise)
Fast gradient boosting with leaf-wise tree growth. State-of-the-art for many tabular tasks.
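Leaf-wise (best-first) growth means the tree always splits the leaf that yields the largest gain, rather than expanding every leaf at the current depth as level-wise growth does. A minimal sketch of that selection loop, using toy precomputed gains rather than LightGBM's actual histogram-based gain computation:

```python
import heapq

def grow_leaf_wise(leaf_gains, max_leaves):
    """Best-first tree growth: always split the leaf with the highest gain.

    leaf_gains: maps a leaf id to the (child_id, toy_gain) pairs produced
                by splitting it; leaves absent from the map are terminal.
    Returns the leaf ids in the order they were split.
    """
    # Max-heap via negated gains; start from the root (id 0, toy gain 1.0).
    heap = [(-1.0, 0)]
    n_leaves, split_order = 1, []
    while heap and n_leaves < max_leaves:
        _, leaf = heapq.heappop(heap)
        if leaf not in leaf_gains:
            continue  # terminal leaf: no split available
        split_order.append(leaf)
        n_leaves += 1  # splitting turns one leaf into two
        for child, gain in leaf_gains[leaf]:
            heapq.heappush(heap, (-gain, child))
    return split_order

# Toy tree: root 0 splits into leaves 1 (gain 0.9) and 2 (gain 0.3);
# leaf 1 splits into leaves 3 (gain 0.2) and 4 (gain 0.1).
gains = {0: [(1, 0.9), (2, 0.3)], 1: [(3, 0.2), (4, 0.1)]}
print(grow_leaf_wise(gains, max_leaves=3))  # → [0, 1]: root first, then the higher-gain leaf
```

With a leaf budget of 3, the sketch splits the root and then leaf 1 (gain 0.9) while leaf 2 (gain 0.3) stays unsplit; this depth-unbalanced growth is why LightGBM caps complexity with `num_leaves` rather than depth alone.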
§ 01 · Benchmarks
Every benchmark LightGBM has a recorded score for.
| # | Benchmark | Area · Task | Metric | Value | Rank | Date | Source |
|---|---|---|---|---|---|---|---|
| 01 | California Housing | Tabular · Regression | rmse | 0.40 | #2 | — | source ↗ |
| 02 | OpenML-CC18 | Tabular · Classification | accuracy | 86.9% | #3 | 2025-06-01 | source ↗ |
The Rank column shows this model's position among all models scored on the same benchmark and metric. #1 marks the current SOTA. Rows are sorted by rank, then by newest result.
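For context on the regression row: RMSE on California Housing is reported in the target's units (median house value in units of $100k), not as a percentage. A minimal computation of the metric:

```python
import math

def rmse(y_true, y_pred):
    """Root mean squared error: square root of the mean squared residual."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

y_true = [2.5, 0.8, 4.1, 1.9]  # toy targets in $100k units (illustrative values)
y_pred = [2.2, 1.0, 3.8, 2.1]
print(round(rmse(y_true, y_pred), 3))  # → 0.255
```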
§ 03 · Papers
1 paper with results for LightGBM.
- 2025-06-01 · 1 result
ConTextTab: A Semantics-Aware Tabular In-Context Learner
Marco Spinaci
§ 04 · Related models
Other Microsoft models scored on Codesota.
§ 05 · Sources & freshness
Where these numbers come from.
- LightGBM scikit-learn benchmark · 1 result
- ConTextTab Table 1 · 1 result
2 of 2 rows marked verified.