Time Series Forecasting via Learning Convolutionally Low-Rank Models

Cited by: 10
Author
Liu, Guangcan [1]
Affiliation
[1] Southeast University, School of Automation, Nanjing 210018, People's Republic of China
Keywords
Convolution; Time series analysis; Forecasting; Sparse matrices; Predictive models; Market research; Discrete Fourier transforms; Compressed sensing; Sparsity and low-rankness; Dictionary learning; Time series forecasting; Model combination; Fourier transform; Coherence; Overcomplete dictionaries
DOI
10.1109/TIT.2022.3144605
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812 (Computer Science and Technology)
Abstract
Recently, Liu and Zhang studied the rather challenging problem of time series forecasting from the perspective of compressed sensing. They proposed a learning-free method, named Convolution Nuclear Norm Minimization (CNNM), and proved that CNNM can exactly recover the future part of a series from its observed part, provided that the series is convolutionally low-rank. While impressive, the convolutional low-rankness condition may not be satisfied whenever the series is far from seasonal, and it is in fact brittle in the presence of trends and dynamics. This paper addresses these issues by integrating a learnable, orthonormal transformation into CNNM, with the purpose of converting series of intricate structures into regular signals that are convolutionally low-rank. We prove that the resulting model, termed Learning-Based CNNM (LbCNNM), strictly succeeds in identifying the future part of a series as long as the transform of the series is convolutionally low-rank. To learn proper transformations that may meet the required success conditions, we devise an interpretable method based on Principal Component Pursuit (PCP). Equipped with this learning method and some elaborate data augmentation techniques, LbCNNM not only handles well the major components of time series (including trends, seasonality, and dynamics), but can also make use of the forecasts provided by other forecasting methods; this means that LbCNNM can serve as a general tool for model combination. Extensive experiments on 100,452 real-world time series from the Time Series Data Library (TSDL) and the M4 Competition (M4) demonstrate the superior performance of LbCNNM.
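To make the abstract's machinery concrete, here is a minimal sketch of the optimization programs it alludes to, with notation assumed for illustration rather than taken verbatim from the paper: let m denote the target series, \Omega the index set of its observed (past) entries, \mathcal{P}_\Omega the corresponding sampling operator, \mathcal{A}(\cdot) the operator that forms the circular convolution matrix of a signal, and \|\cdot\|_* the nuclear norm. CNNM completes the series by solving

\min_{x} \; \|\mathcal{A}(x)\|_* \quad \text{s.t.} \quad \mathcal{P}_\Omega(x) = \mathcal{P}_\Omega(m),

which provably recovers the future part when m itself is convolutionally low-rank. LbCNNM instead demands low-rankness of a transformed signal, inserting a learnable orthonormal transform T (with T^{\mathsf{T}} T = I):

\min_{x} \; \|\mathcal{A}(T x)\|_* \quad \text{s.t.} \quad \mathcal{P}_\Omega(x) = \mathcal{P}_\Omega(m).

The transform itself is learned with the help of Principal Component Pursuit (reference [5] below), which decomposes a given matrix M into a low-rank part L plus a sparse part S:

\min_{L, S} \; \|L\|_* + \lambda \|S\|_1 \quad \text{s.t.} \quad L + S = M,

where \lambda > 0 balances the two terms (a common choice is \lambda = 1/\sqrt{\max(n_1, n_2)} for an n_1 \times n_2 matrix M).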
Pages: 3362-3380
Number of pages: 19
Related Papers (38 in total; the first 10 are listed below)
[1] Aharon M, Elad M, Bruckstein AM. On the uniqueness of overcomplete dictionaries, and a practical way to retrieve them. Linear Algebra and Its Applications, 2006, 416(1): 48-67.
[2] Aharon M, Elad M, Bruckstein A. K-SVD: An algorithm for designing overcomplete dictionaries for sparse representation. IEEE Transactions on Signal Processing, 2006, 54(11): 4311-4322.
[3] Bruder B. SSRN, 2011, 1: 1.
[4] Candès EJ, Wakin MB. An introduction to compressive sampling. IEEE Signal Processing Magazine, 2008, 25(2): 21-30. DOI: 10.1109/MSP.2007.914731.
[5] Candès EJ, Li X, Ma Y, Wright J. Robust principal component analysis? Journal of the ACM, 2011, 58(3).
[6] Candès EJ, Plan Y. Matrix completion with noise. Proceedings of the IEEE, 2010, 98(6): 925-936.
[7] Candès EJ, Recht B. Exact matrix completion via convex optimization. Foundations of Computational Mathematics, 2009, 9(6): 717-772.
[8] Che Z, Purushotham S, Cho K, Sontag D, Liu Y. Recurrent neural networks for multivariate time series with missing values. Scientific Reports, 2018, 8.
[9] Chen Y. Incoherence-optimal matrix completion. IEEE Transactions on Information Theory, 2015, 61(5): 2909-2923.
[10] Combescure M. Block-circulant matrices with circulant blocks, Weil sums, and mutually unbiased bases. II. The prime power case. Journal of Mathematical Physics, 2009, 50(3).