A Low-Complexity Transformer-Based Time Series Forecasting Model
First published: 2023-04-21
Abstract: Time series forecasting predicts future values from historical observations and plays an important role in retail, finance, power systems, and other fields, so producing accurate forecasts is of great practical significance. Existing deep learning models such as RNNs and LSTMs have achieved many results in time series forecasting, but their sequential structure makes it difficult for them to capture long-range dependencies in the data, so their accuracy degrades on long time series. The Transformer's self-attention mechanism supports parallel processing and is sensitive to long-range dependencies, but its quadratic complexity in the sequence length limits its applicability to long time series forecasting. To address this quadratic complexity, we propose the Max Self-attention mechanism, which has lower time complexity than standard self-attention and achieves both faster computation and higher prediction accuracy on long time series forecasting problems.
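For context on the quadratic complexity the abstract refers to, the sketch below implements standard scaled dot-product self-attention in NumPy. The score matrix has shape (L, L) for a sequence of length L, which is the source of the O(L²) time and memory cost. This is only the baseline mechanism; the proposed Max Self-attention is not specified in the abstract and is not shown here.

```python
import numpy as np

def self_attention(x, Wq, Wk, Wv):
    """Standard scaled dot-product self-attention.

    x: (L, d) input sequence; Wq, Wk, Wv: (d, d) projection matrices.
    The (L, L) score matrix below is what makes the cost quadratic in L.
    """
    Q, K, V = x @ Wq, x @ Wk, x @ Wv
    scores = Q @ K.T / np.sqrt(x.shape[1])          # shape (L, L): O(L^2)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                              # shape (L, d)

# Toy example: a length-96 sequence with feature dimension 8.
L, d = 96, 8
rng = np.random.default_rng(0)
x = rng.standard_normal((L, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
out = self_attention(x, Wq, Wk, Wv)
print(out.shape)  # (96, 8)
```

Doubling L quadruples the size of the score matrix, which is why low-complexity attention variants target this step.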
Keywords: time series forecasting; self-attention mechanism; numerical prediction; deep learning