Time Series Forecasting on ETTh2 (720-2)
Metrics: MAE (mean absolute error), MSE (mean squared error)
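For reference, the two leaderboard metrics are defined as usual: MAE averages the absolute errors and MSE averages the squared errors over all forecast steps and variables. Below is a minimal sketch of how they might be computed for a 720-step forecast; the array shapes and variable names are illustrative and not taken from any particular benchmark codebase.

# Illustrative sketch only: names, shapes, and data are assumptions.
import numpy as np

def mae(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    # Mean absolute error, averaged over all windows, horizon steps, and variables.
    return float(np.mean(np.abs(y_true - y_pred)))

def mse(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    # Mean squared error, averaged over all windows, horizon steps, and variables.
    return float(np.mean((y_true - y_pred) ** 2))

# Example: a batch of 32 test windows with a 720-step horizon and one target
# variable; the values are random and serve only to exercise the functions.
rng = np.random.default_rng(0)
y_true = rng.normal(size=(32, 720, 1))
y_pred = y_true + rng.normal(scale=0.1, size=(32, 720, 1))
print("MAE:", mae(y_true, y_pred))
print("MSE:", mse(y_true, y_pred))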
Results

Performance results of various models on this benchmark.

Model Name     | MAE   | MSE    | Paper Title
Transformer    | 0.434 | 0.2853 | Long-term series forecasting with Query Selector -- efficient model of sparse attention
PatchMixer     | 0.374 | 0.217  | PatchMixer: A Patch-Mixing Architecture for Long-Term Time Series Forecasting
FiLM           | 0.396 | 0.241  | FiLM: Frequency improved Legendre Memory Model for Long-term Time Series Forecasting
Informer       | 0.338 | 0.181  | Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting
PatchTST/64    | 0.38  | 0.223  | A Time Series is Worth 64 Words: Long-term Forecasting with Transformers
SegRNN         | 0.365 | 0.205  | SegRNN: Segment Recurrent Neural Network for Long-Term Time Series Forecasting
NLinear        | 0.381 | 0.225  | Are Transformers Effective for Time Series Forecasting?
SCINet         | 0.429 | 0.286  | SCINet: Time Series Modeling and Forecasting with Sample Convolution and Interaction
AutoCon        | 0.344 | 0.177  | Self-Supervised Contrastive Learning for Long-term Forecasting
QuerySelector  | 0.413 | 0.2585 | Long-term series forecasting with Query Selector -- efficient model of sparse attention
DLinear        | 0.426 | 0.276  | Are Transformers Effective for Time Series Forecasting?