Time Series Forecasting on ETTh1 (horizon 336)
Metrics
MAE
MSE
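Both metrics average the per-point forecast error over the prediction horizon and all variables: MAE uses the absolute error, MSE the squared error (which penalizes large deviations more heavily). A minimal NumPy sketch — function names and the toy arrays are illustrative, not taken from any benchmark codebase:

```python
import numpy as np

def mae(y_true, y_pred):
    """Mean Absolute Error: mean of |y_true - y_pred| over all forecast points."""
    return float(np.mean(np.abs(np.asarray(y_true) - np.asarray(y_pred))))

def mse(y_true, y_pred):
    """Mean Squared Error: mean of (y_true - y_pred)^2 over all forecast points."""
    return float(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2))

# Toy forecast over a 4-step horizon
y_true = np.array([0.5, 0.7, 0.6, 0.8])
y_pred = np.array([0.4, 0.8, 0.6, 0.9])
print(round(mae(y_true, y_pred), 4))  # 0.075
print(round(mse(y_true, y_pred), 4))  # 0.0075
```

On this benchmark, both metrics are computed on the standardized (z-scored) series, which is why scores are comparable across models despite the raw data's differing scales.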
Results

Performance results of the various models on this benchmark.

Comparison Table
Model | MAE | MSE |
---|---|---|
convtimenet-a-deep-hierarchical-fully | 0.420 | 0.405 |
sparsetsf-modeling-long-term-time-series | - | 0.434 |
generative-pretrained-hierarchical | 0.423 | 0.430 |
revisiting-long-term-time-series-forecasting | 0.423 | 0.42 |
time-series-is-a-special-sequence-forecasting | 0.495 | 0.504 |
rose-register-assisted-general-time-series | 0.422 | 0.406 |
taming-pre-trained-language-models-with-n | 0.436 | 0.456 |
only-the-curve-shape-matters-training | 0.419 | 0.424 |
autotimes-autoregressive-time-series | 0.429 | 0.401 |
a-time-series-is-worth-64-words-long-term | 0.44 | 0.422 |
only-the-curve-shape-matters-training | 0.427 | 0.459 |
taming-pre-trained-llms-for-generalised-time | 0.436 | 0.456 |
units-building-a-unified-time-series-model | 0.422 | 0.405 |
tempo-prompt-based-generative-pre-trained | 0.425 | 0.408 |
patchmixer-a-patch-mixing-architecture-for | 0.414 | 0.392 |
winnet-time-series-forecasting-with-a-window | 0.426 | 0.419 |
leveraging-2d-information-for-long-term-time | 0.440 | 0.436 |
pathformer-multi-scale-transformers-with | 0.432 | 0.454 |
forecastgrapher-redefining-multivariate-time | 0.448 | 0.472 |
only-the-curve-shape-matters-training | 0.444 | 0.475 |
unitime-a-language-empowered-unified-model | 0.407 | 0.398 |
film-frequency-improved-legendre-memory-model | 0.445 | 0.442 |
an-analysis-of-linear-time-series-forecasting | - | 0.448 |
d-pad-deep-shallow-multi-frequency-patterns | 0.406 | 0.374 |
timemachine-a-time-series-is-worth-4-mambas | 0.421 | 0.429 |
informer-beyond-efficient-transformer-for | 0.753 | 0.884 |
fredformer-frequency-debiased-transformer-for | 0.403 | 0.395 |
generative-pretrained-hierarchical | 0.432 | 0.456 |
unified-training-of-universal-time-series | 0.429 | 0.412 |
tsmixer-an-all-mlp-architecture-for-time | 0.431 | - |
rethinking-channel-dependence-for | 0.453 | 0.453 |
adaptive-multi-scale-decomposition-framework | 0.427 | 0.418 |
xpatch-dual-stream-time-series-forecasting | 0.415 | 0.391 |
rethinking-channel-dependence-for | 0.435 | 0.433 |
basisformer-attention-based-time-series-1 | 0.451 | 0.473 |
unlocking-the-potential-of-transformers-in | 0.425 | 0.423 |
attention-as-an-rnn | 0.55 | 0.65 |
segrnn-segment-recurrent-neural-network-for | 0.417 | 0.401 |
a-decoder-only-foundation-model-for-time | 0.436 | - |
fits-modeling-time-series-with-10k-parameters | - | 0.427 |
disentangled-interpretable-representation-for | - | 0.424 |
softs-efficient-multivariate-time-series | 0.452 | 0.480 |
unified-training-of-universal-time-series | 0.450 | 0.456 |
is-mamba-effective-for-time-series | 0.468 | 0.489 |
only-the-curve-shape-matters-training | 0.432 | 0.468 |
timecma-towards-llm-empowered-time-series | 0.405 | 0.403 |
long-term-series-forecasting-with-query | 0.7041 | 0.8321 |
atfnet-adaptive-time-frequency-ensembled | 0.521 | 0.514 |
prformer-pyramidal-recurrent-transformer-for | - | 0.427 |
tsmixer-lightweight-mlp-mixer-model-for | 0.436 | 0.421 |
himtm-hierarchical-multi-scale-masked-time | 0.430 | 0.422 |
boosting-mlps-with-a-coarsening-strategy-for | 0.450 | 0.479 |
autoformer-decomposition-transformers-with | 0.484 | 0.505 |
mixture-of-linear-experts-for-long-term-time | - | 0.469 |
long-term-series-forecasting-with-query | 0.7039 | 0.8503 |
minusformer-improving-time-series-forecasting | 0.446 | 0.465 |
itransformer-inverted-transformers-are | 0.458 | 0.487 |
ltboost-boosted-hybrids-of-ensemble-linear | 0.423 | 0.424 |
long-term-forecasting-with-tide-time-series | 0.433 | 0.435 |
cats-enhancing-multivariate-time-series | 0.437 | 0.423 |
time-evidence-fusion-network-multi-source | 0.441 | 0.475 |
mixture-of-linear-experts-for-long-term-time | - | 0.43 |
only-the-curve-shape-matters-training | 0.418 | 0.433 |
only-the-curve-shape-matters-training | 0.436 | 0.466 |
mamba-360-survey-of-state-space-models-as | 0.443 | 0.473 |
random-projection-layers-for-multidimensional | 0.498 | 0.521 |
vcformer-variable-correlation-transformer | 0.449 | 0.473 |
bi-mamba4ts-bidirectional-mamba-for-time | 0.445 | 0.455 |
unified-training-of-universal-time-series | 0.474 | 0.514 |
deformtime-capturing-variable-dependencies | 0.2158 | - |
are-transformers-effective-for-time-series | 0.427 | 0.429 |
are-transformers-effective-for-time-series | 0.443 | 0.439 |