HyperAI초신경
Time Series Forecasting On Etth2 96 1
Evaluation metric: MSE

Evaluation results: performance of each model on this benchmark.

| Model Name | MSE | Paper Title |
|---|---|---|
| MoLE-RLinear | 0.273 | Mixture-of-Linear-Experts for Long-term Time Series Forecasting |
| DiPE-Linear | 0.275 | Disentangled Interpretable Representation for Efficient Long-term Time Series Forecasting |
| PRformer | 0.268 | PRformer: Pyramidal Recurrent Transformer for Multivariate Time Series Forecasting |
| NLinear | 0.277 | Are Transformers Effective for Time Series Forecasting? |
| MoLE-DLinear | 0.287 | Mixture-of-Linear-Experts for Long-term Time Series Forecasting |
| LTBoost (drop_last=false) | 0.263 | LTBoost: Boosted Hybrids of Ensemble Linear and Gradient Algorithms for the Long-term Time Series Forecasting |
| PatchMixer | 0.225 | PatchMixer: A Patch-Mixing Architecture for Long-Term Time Series Forecasting |
| SegRNN | 0.263 | SegRNN: Segment Recurrent Neural Network for Long-Term Time Series Forecasting |
| xPatch | 0.226 | xPatch: Dual-Stream Time Series Forecasting with Exponential Seasonal-Trend Decomposition |
| TiDE | 0.27 | Long-term Forecasting with TiDE: Time-series Dense Encoder |
| FiLM | 0.284 | FiLM: Frequency improved Legendre Memory Model for Long-term Time Series Forecasting |
| TSMixer | 0.276 | TSMixer: Lightweight MLP-Mixer Model for Multivariate Time Series Forecasting |
| RLinear | 0.262 | Revisiting Long-term Time Series Forecasting: An Investigation on Linear Mapping |
| TEFN | 0.288 | Time Evidence Fusion Network: Multi-source View in Long-Term Time Series Forecasting |
| DLinear | 0.289 | Are Transformers Effective for Time Series Forecasting? |
| PatchTST/64 | 0.274 | A Time Series is Worth 64 Words: Long-term Forecasting with Transformers |
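All entries above are ranked by mean squared error (MSE) on the forecast horizon. As a minimal illustration of the metric itself (not tied to any specific model in the table), here is a sketch in Python: the array shapes and the constant 0.1 prediction error are hypothetical, chosen only to echo a 96-step, 7-variable ETT-style forecast.

```python
import numpy as np

def mse(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Mean squared error averaged over all elements
    (time steps and variables), as typically reported
    on long-term forecasting benchmarks."""
    return float(np.mean((y_true - y_pred) ** 2))

# Hypothetical example: a 96-step horizon over 7 variables,
# with a forecast that is off by a constant 0.1 everywhere.
rng = np.random.default_rng(0)
y_true = rng.normal(size=(96, 7))
y_pred = y_true + 0.1
print(round(mse(y_true, y_pred), 4))  # → 0.01
```

Because the error is a constant 0.1, the MSE is exactly 0.1² = 0.01; real benchmark scores like those above are averages of such squared errors over many test windows.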