Time Series Forecasting on ETTh2 (720, 2)
Evaluation metrics: MAE, MSE
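Both metrics are averaged over the full 720-step forecast horizon. As a reference, the sketch below shows how MAE and MSE are typically computed for this kind of benchmark; it is a minimal illustration assuming NumPy arrays of forecasts and ground truth, not the exact evaluation script used by any of the listed papers.

```python
import numpy as np

def mae(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    # Mean absolute error, averaged over every forecast step (and series).
    return float(np.mean(np.abs(y_true - y_pred)))

def mse(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    # Mean squared error, averaged over every forecast step (and series).
    return float(np.mean((y_true - y_pred) ** 2))

# Hypothetical example with a 720-step prediction horizon.
rng = np.random.default_rng(0)
y_true = rng.standard_normal((32, 720))                 # (batch, horizon)
y_pred = y_true + 0.1 * rng.standard_normal((32, 720))  # noisy forecasts
print(f"MAE={mae(y_true, y_pred):.3f}  MSE={mse(y_true, y_pred):.3f}")
```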
Evaluation results
Performance of each model on this benchmark:
| Model Name | MAE | MSE | Paper Title |
|---|---|---|---|
| Transformer | 0.434 | 0.2853 | Long-term series forecasting with Query Selector -- efficient model of sparse attention |
| PatchMixer | 0.374 | 0.217 | PatchMixer: A Patch-Mixing Architecture for Long-Term Time Series Forecasting |
| FiLM | 0.396 | 0.241 | FiLM: Frequency improved Legendre Memory Model for Long-term Time Series Forecasting |
| Informer | 0.338 | 0.181 | Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting |
| PatchTST/64 | 0.38 | 0.223 | A Time Series is Worth 64 Words: Long-term Forecasting with Transformers |
| SegRNN | 0.365 | 0.205 | SegRNN: Segment Recurrent Neural Network for Long-Term Time Series Forecasting |
| NLinear | 0.381 | 0.225 | Are Transformers Effective for Time Series Forecasting? |
| SCINet | 0.429 | 0.286 | SCINet: Time Series Modeling and Forecasting with Sample Convolution and Interaction |
| AutoCon | 0.344 | 0.177 | Self-Supervised Contrastive Learning for Long-term Forecasting |
| QuerySelector | 0.413 | 0.2585 | Long-term series forecasting with Query Selector -- efficient model of sparse attention |
| DLinear | 0.426 | 0.276 | Are Transformers Effective for Time Series Forecasting? |