
Time Series Forecasting on ETTh2 (720)

Evaluation Metrics

MAE (Mean Absolute Error)
MSE (Mean Squared Error)

Benchmark Results

Performance of each model on this benchmark:

| Model | MAE | MSE | Paper |
| --- | --- | --- | --- |
| Transformer | 0.434 | 0.2853 | Long-term series forecasting with Query Selector -- efficient model of sparse attention |
| PatchMixer | 0.374 | 0.217 | PatchMixer: A Patch-Mixing Architecture for Long-Term Time Series Forecasting |
| FiLM | 0.396 | 0.241 | FiLM: Frequency improved Legendre Memory Model for Long-term Time Series Forecasting |
| Informer | 0.338 | 0.181 | Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting |
| PatchTST/64 | 0.38 | 0.223 | A Time Series is Worth 64 Words: Long-term Forecasting with Transformers |
| SegRNN | 0.365 | 0.205 | SegRNN: Segment Recurrent Neural Network for Long-Term Time Series Forecasting |
| NLinear | 0.381 | 0.225 | Are Transformers Effective for Time Series Forecasting? |
| SCINet | 0.429 | 0.286 | SCINet: Time Series Modeling and Forecasting with Sample Convolution and Interaction |
| AutoCon | 0.344 | 0.177 | Self-Supervised Contrastive Learning for Long-term Forecasting |
| QuerySelector | 0.413 | 0.2585 | Long-term series forecasting with Query Selector -- efficient model of sparse attention |
| DLinear | 0.426 | 0.276 | Are Transformers Effective for Time Series Forecasting? |
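The two metrics above are the standard point-forecast errors: MAE averages the absolute deviation between prediction and ground truth over the forecast horizon, MSE averages the squared deviation (penalizing large errors more heavily). A minimal sketch of how they are computed, using NumPy and illustrative toy data:

```python
import numpy as np

def mae(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Mean Absolute Error: mean of |y_true - y_pred| over the horizon."""
    return float(np.mean(np.abs(y_true - y_pred)))

def mse(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Mean Squared Error: mean of (y_true - y_pred)^2 over the horizon."""
    return float(np.mean((y_true - y_pred) ** 2))

# Toy ground truth and forecast over a 4-step horizon (illustrative values,
# not from the benchmark)
y_true = np.array([1.0, 2.0, 3.0, 4.0])
y_pred = np.array([1.5, 2.0, 2.5, 4.5])

print(mae(y_true, y_pred))  # 0.375
print(mse(y_true, y_pred))  # 0.1875
```

On this benchmark both metrics are averaged over all variables and test windows, so lower values indicate better long-horizon (720-step) forecasts; note that MSE and MAE can rank models differently, as seen in the table.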