Time Series Forecasting On Etth1 96 1
Evaluation Metrics: MAE, MSE
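MAE and MSE here are pointwise forecast errors averaged over the prediction horizon and all variables, and on long-term forecasting benchmarks like this one they are typically computed on the standardized series. A minimal NumPy sketch, not the evaluation script of any listed paper; the batch/horizon/variable shapes are illustrative:

```python
import numpy as np

def mae(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Mean Absolute Error averaged over every forecast point."""
    return float(np.mean(np.abs(y_true - y_pred)))

def mse(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Mean Squared Error averaged over every forecast point."""
    return float(np.mean((y_true - y_pred) ** 2))

# Illustrative shapes only: 32 forecast windows, horizon 96 (as in the
# benchmark name), 7 variables (the ETTh1 column count).
rng = np.random.default_rng(0)
y_true = rng.normal(size=(32, 96, 7))
y_pred = y_true + rng.normal(scale=0.1, size=y_true.shape)
print(f"MAE={mae(y_true, y_pred):.3f}  MSE={mse(y_true, y_pred):.3f}")
```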
Evaluation Results
Performance of each model on this benchmark.
| Model Name | MAE | MSE | Paper Title |
| --- | --- | --- | --- |
| xPatch | 0.379 | 0.354 | xPatch: Dual-Stream Time Series Forecasting with Exponential Seasonal-Trend Decomposition |
| FiLM | 0.394 | 0.371 | FiLM: Frequency improved Legendre Memory Model for Long-term Time Series Forecasting |
| TiDE | 0.398 | 0.375 | Long-term Forecasting with TiDE: Time-series Dense Encoder |
| DiPE-Linear | - | 0.369 | Disentangled Interpretable Representation for Efficient Long-term Time Series Forecasting |
| MoLE-DLinear | - | 0.377 | Mixture-of-Linear-Experts for Long-term Time Series Forecasting |
| LTBoost (drop_last=false) | 0.382 | 0.357 | LTBoost: Boosted Hybrids of Ensemble Linear and Gradient Algorithms for the Long-term Time Series Forecasting |
| RLinear | 0.391 | 0.366 | Revisiting Long-term Time Series Forecasting: An Investigation on Linear Mapping |
| TTM | - | 0.36 | Tiny Time Mixers (TTMs): Fast Pre-trained Models for Enhanced Zero/Few-Shot Forecasting of Multivariate Time Series |
| TSMixer | 0.398 | 0.368 | TSMixer: Lightweight MLP-Mixer Model for Multivariate Time Series Forecasting |
| TEFN | 0.391 | 0.383 | Time Evidence Fusion Network: Multi-source View in Long-Term Time Series Forecasting |
| SegRNN | 0.376 | 0.341 | SegRNN: Segment Recurrent Neural Network for Long-Term Time Series Forecasting |
| PatchMixer | 0.381 | 0.353 | PatchMixer: A Patch-Mixing Architecture for Long-Term Time Series Forecasting |
| MoLE-RLinear | - | 0.375 | Mixture-of-Linear-Experts for Long-term Time Series Forecasting |
| PRformer | - | 0.354 | PRformer: Pyramidal Recurrent Transformer for Multivariate Time Series Forecasting |
| PatchTST/64 | 0.4 | 0.37 | A Time Series is Worth 64 Words: Long-term Forecasting with Transformers |
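Several entries report only one of the two metrics (shown as "-"). When comparing rows programmatically, it helps to coerce those placeholders to NaN so ranking by the available metric still works. A small hypothetical sketch with pandas, using a handful of the rows above:

```python
import pandas as pd

# A few rows from the leaderboard above; "-" marks a metric the entry did not report.
rows = [
    ("SegRNN",     "0.376", "0.341"),
    ("PatchMixer", "0.381", "0.353"),
    ("xPatch",     "0.379", "0.354"),
    ("PRformer",   "-",     "0.354"),
    ("TTM",        "-",     "0.36"),
]
df = pd.DataFrame(rows, columns=["model", "MAE", "MSE"])

# Convert metric strings to numbers; "-" becomes NaN.
df[["MAE", "MSE"]] = df[["MAE", "MSE"]].apply(pd.to_numeric, errors="coerce")

# Rank by MSE (lower is better); rows with missing MAE are kept.
print(df.sort_values("MSE").to_string(index=False))
```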