Codesota · Models · iTransformer · THUML · 5 results · 2 benchmarks
Model card

iTransformer

THUML · open-source · Transformer (inverted attention)

Inverts the attention mechanism to model multivariate correlations across channels rather than time steps.
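A minimal NumPy sketch of that inversion (dimensions and weights here are illustrative, not the paper's actual configuration): each variate's full lookback series is embedded as a single token, so self-attention runs over the N channel tokens rather than the T time steps.

```python
import numpy as np

rng = np.random.default_rng(0)

T, N, D = 96, 7, 16                 # lookback length, number of variates, embed dim
X = rng.standard_normal((T, N))     # multivariate input series

# "Inverted" tokenization: each variate's whole series becomes one token,
# giving a token sequence of length N (channels), not T (time steps).
W_embed = rng.standard_normal((T, D)) / np.sqrt(T)
tokens = X.T @ W_embed              # (N, D): one token per variate

# Plain scaled dot-product self-attention over the N variate tokens,
# so attention weights express correlations across channels.
Wq = rng.standard_normal((D, D)) / np.sqrt(D)
Wk = rng.standard_normal((D, D)) / np.sqrt(D)
Wv = rng.standard_normal((D, D)) / np.sqrt(D)
Q, K, V = tokens @ Wq, tokens @ Wk, tokens @ Wv

scores = Q @ K.T / np.sqrt(D)       # (N, N) channel-to-channel scores
attn = np.exp(scores - scores.max(axis=-1, keepdims=True))
attn /= attn.sum(axis=-1, keepdims=True)
out = attn @ V                      # (N, D): each variate attends to the others

print(out.shape)  # → (7, 16)
```

A vanilla Transformer would instead build T time-step tokens of width N and attend across time; the inversion is only in which axis becomes the token sequence.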

§ 01 · Benchmarks

Every benchmark iTransformer has a recorded score for.

#   Benchmark       Area · Task                             Metric  Value  Rank   Date        Source
01  Weather         Time Series · Time Series Forecasting   mae     0.3    #5/6   2024-05-07  source ↗
02  Weather         Time Series · Time Series Forecasting   mse     0.3    #5/6   2024-05-07  source ↗
03  M4 Competition  Time Series · Time Series Forecasting   mase    1.8    #5/13  2023-10-10  source ↗
04  M4 Competition  Time Series · Time Series Forecasting   owa     0.9    #5/13  2023-10-10  source ↗
05  M4 Competition  Time Series · Time Series Forecasting   smape   12.7%  #6/13  2023-10-10  source ↗
The Rank column shows this model’s position against all other models scored on the same benchmark and metric (total competitors after the slash). #1 in red marks the current SOTA. Rows are sorted by rank, then by newest result.
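For reference, the error metrics in the table above can be computed as follows. This is a generic sketch of the standard definitions of mae, mse, and smape (not code from any of the cited papers; mase and owa additionally need a naive-forecast baseline, omitted here).

```python
import numpy as np

def mae(y, yhat):
    # Mean absolute error (unitless w.r.t. the target scale).
    return np.mean(np.abs(y - yhat))

def mse(y, yhat):
    # Mean squared error.
    return np.mean((y - yhat) ** 2)

def smape(y, yhat):
    # Symmetric MAPE, reported as a percentage (as in the M4 rows above).
    return 100.0 * np.mean(2.0 * np.abs(yhat - y) / (np.abs(y) + np.abs(yhat)))

# Toy ground truth and forecast, purely for illustration.
y    = np.array([1.0, 2.0, 3.0, 4.0])
yhat = np.array([1.1, 1.9, 3.2, 3.8])
print(round(mae(y, yhat), 3), round(mse(y, yhat), 3), round(smape(y, yhat), 2))
# → 0.15 0.025 6.56
```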
§ 02 · Strengths by area

Where iTransformer actually performs.

Time Series · 2 benchmarks · avg rank #5.2
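The avg rank figure is presumably the plain mean of the Rank column over this model’s five benchmark rows:

```python
ranks = [5, 5, 5, 5, 6]              # Rank column from the benchmark table
avg_rank = sum(ranks) / len(ranks)
print(round(avg_rank, 1))            # → 5.2
```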
§ 03 · Papers

2 papers with results for iTransformer.

  1. 2024-05-07 · Time Series · 2 results

     iTransformer: Inverted Transformers Are Effective for Time Series Forecasting

     Yong Liu, Tengge Hu, Haoran Zhang, Ling Jin et al.

  2. 2023-10-10 · Time Series · 3 results

     iTransformer: Inverted Transformers Are Effective for Time Series Forecasting

     Yong Liu, Tengge Hu, Haoran Zhang, Ling Jin et al.
§ 04 · Related models

Other THUML models scored on Codesota.

Autoformer — unknown params · 0 results
DLinear — 0 results
FEDformer — unknown params · 0 results
TimesNet — 0 results
§ 05 · Sources & freshness

Where these numbers come from.

TimeMixer++ Table 2 — 3 results
iTransformer Table 1 — 2 results
5 of 5 rows marked verified · first result 2023-10-10, latest 2024-05-07.