Codesota · Models · PatchTST
IBM Research · 5 results · 2 benchmarks
Model card

PatchTST

IBM Research · open-source · Transformer (patch tokenization)

Segments time series into patches as tokens for efficient transformer attention.
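The patching idea above can be sketched in a few lines. This is an illustrative NumPy snippet, not PatchTST's actual implementation; `patch_len=16` and `stride=8` are the defaults reported in the PatchTST paper, and the function name `patchify` is ours.

```python
import numpy as np

def patchify(series: np.ndarray, patch_len: int = 16, stride: int = 8) -> np.ndarray:
    """Split a 1-D series into overlapping patches that serve as tokens.

    Illustrative sketch: patch_len=16 / stride=8 are the paper's defaults,
    not a fixed requirement of the method.
    """
    n = series.shape[0]
    num_patches = (n - patch_len) // stride + 1
    # Each patch is a contiguous window of the series; consecutive
    # patches overlap by patch_len - stride points.
    return np.stack([series[i * stride : i * stride + patch_len]
                     for i in range(num_patches)])

# A length-336 input yields (336 - 16) // 8 + 1 = 41 patch tokens,
# so self-attention runs over 41 tokens instead of 336 time steps.
tokens = patchify(np.arange(336, dtype=np.float32))
print(tokens.shape)  # (41, 16)
```

Because attention cost is quadratic in sequence length, shrinking 336 steps to 41 patch tokens is what makes long look-back windows affordable.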

§ 01 · Benchmarks

Every benchmark PatchTST has a recorded score for.

#  | Benchmark      | Area · Task                           | Metric | Value | Rank  | Date       | Source
01 | M4 Competition | Time Series · Time Series Forecasting | mase   | 1.9   | #2/13 | 2022-11-27 | source ↗
02 | Weather        | Time Series · Time Series Forecasting | mse    | 0.3   | #3/6  | 2024-05-07 | source ↗
03 | M4 Competition | Time Series · Time Series Forecasting | owa    | 1.0   | #3/13 | 2022-11-27 | source ↗
04 | M4 Competition | Time Series · Time Series Forecasting | smape  | 13.2% | #3/13 | 2022-11-27 | source ↗
05 | Weather        | Time Series · Time Series Forecasting | mae    | 0.3   | #4/6  | 2024-05-07 | source ↗
Rank column shows this model’s position vs all other models scored on the same benchmark + metric (competitors after the slash). #1 in red means current SOTA. Sorted by rank, then newest result.
§ 02 · Strengths by area

Where PatchTST actually performs.

Time Series — 2 benchmarks · avg rank #3.0
§ 03 · Papers

2 papers with results for PatchTST.

  1. 2024-05-07 · Time Series · 2 results

    iTransformer: Inverted Transformers Are Effective for Time Series Forecasting

    Yong Liu, Tengge Hu, Haoran Zhang, Ling Jin et al.
  2. 2022-11-27 · Time Series · 3 results

    A Time Series is Worth 64 Words: Long-term Forecasting with Transformers

    Yuqi Nie, Nam H. Nguyen, Phanwadee Sinthong, Jayant Kalagnanam
§ 04 · Related models

Other IBM Research models scored on Codesota.

Docling — unknown params · 0 results
§ 05 · Sources & freshness

Where these numbers come from.

TimeMixer++ Table 2 — 3 results
iTransformer Table 1 — 2 results
5 of 5 rows marked verified · first result 2022-11-27, latest 2024-05-07.