
How Features Benefit: Parallel Series Embedding for Multivariate Time Series Forecasting with Transformer

Zonglin Lyu, Xuande Feng
Abstract

Forecasting time series is an engaging and vital mathematical topic. Theories and applications in related fields have been studied for decades, and deep learning has provided reliable tools in recent years. The Transformer, capable of capturing longer sequence dependencies, has been exploited as a powerful architecture in time series forecasting. While existing work has mainly contributed to breaking the memory bottleneck of the Transformer, how to effectively leverage multivariate time series has received little attention. In this work, a novel architecture built on a primary Transformer is proposed to conduct multivariate time series prediction. Our proposed architecture has two main advantages. First, it accurately predicts multivariate time series with shorter or longer sequence lengths and prediction steps. We benchmark our proposed model against various baseline architectures on real-world datasets, and our model improves their performance significantly. Second, it can easily be leveraged in Transformer-based variants,
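To make the high-level idea concrete, here is a minimal sketch of what a "parallel series embedding" feeding a standard Transformer encoder might look like. All names (`ParallelSeriesEmbedding`, `SketchForecaster`), dimensions, and the choice to embed each variable with its own linear map and sum the results are illustrative assumptions based only on the abstract, not the paper's actual model.

```python
# Hypothetical sketch (assumption, not the paper's exact architecture):
# embed each variable of a multivariate series in parallel, then feed the
# combined embeddings to a vanilla Transformer encoder for forecasting.
import torch
import torch.nn as nn

class ParallelSeriesEmbedding(nn.Module):
    def __init__(self, n_vars: int, d_model: int):
        super().__init__()
        # one linear embedding per variable, applied in parallel
        self.embeds = nn.ModuleList(nn.Linear(1, d_model) for _ in range(n_vars))

    def forward(self, x):  # x: (batch, seq_len, n_vars)
        # embed each univariate channel separately, then sum the embeddings
        parts = [emb(x[..., i:i + 1]) for i, emb in enumerate(self.embeds)]
        return torch.stack(parts, dim=0).sum(dim=0)  # (batch, seq_len, d_model)

class SketchForecaster(nn.Module):
    def __init__(self, n_vars=7, d_model=64, horizon=24):
        super().__init__()
        self.embed = ParallelSeriesEmbedding(n_vars, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, n_vars)  # project back to the variables
        self.horizon = horizon

    def forward(self, x):
        h = self.encoder(self.embed(x))
        # read the forecast off the last `horizon` encoder states
        return self.head(h[:, -self.horizon:, :])

x = torch.randn(8, 96, 7)   # batch of 8 windows, 96 steps, 7 variables
y = SketchForecaster()(x)
print(tuple(y.shape))       # (8, 24, 7): 24 predicted steps for all 7 variables
```

Because the embedding step is the only part that touches the variables individually, a sketch like this drops into any Transformer-based variant by swapping `nn.TransformerEncoder` for the variant's encoder, which matches the second advantage the abstract claims.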