A Survey on Time-Series Pre-Trained Models

Cited by: 8
Authors
Ma, Qianli [1 ]
Liu, Zhen [1 ]
Zheng, Zhenjing [1 ]
Huang, Ziyang [1 ]
Zhu, Siying [1 ]
Yu, Zhongzhong [1 ]
Kwok, James T. [2 ]
Affiliations
[1] South China Univ Technol, Sch Comp Sci & Engn, Guangzhou 510006, Peoples R China
[2] Hong Kong Univ Sci & Technol, Dept Comp Sci & Engn, Hong Kong 999077, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Time series analysis; Data models; Deep learning; Transformers; Forecasting; Surveys; Transfer learning; Convolutional neural networks; Computational modeling; Predictive models; Time-series mining; pre-trained models; deep learning; transfer learning; transformer
DOI
10.1109/TKDE.2024.3475809
CLC Number
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Time-Series Mining (TSM) is an important research area that shows great potential in practical applications. Deep learning models, which rely on massive labeled data, have been utilized for TSM successfully. However, constructing a large-scale, well-labeled dataset is difficult due to data-annotation costs. Recently, pre-trained models have gradually attracted attention in the time-series domain due to their remarkable performance in computer vision and natural language processing. In this survey, we provide a comprehensive review of Time-Series Pre-Trained Models (TS-PTMs), aiming to guide the understanding, application, and study of TS-PTMs. Specifically, we first briefly introduce the typical deep learning models employed in TSM. Then, we give an overview of TS-PTMs according to their pre-training techniques, with supervised, unsupervised, and self-supervised TS-PTMs as the main categories. Further, extensive experiments involving 27 methods, 434 datasets, and 679 transfer-learning scenarios are conducted to analyze the advantages and disadvantages of transfer-learning strategies, Transformer-based models, and representative TS-PTMs. Finally, we point out some promising directions for future work on TS-PTMs.
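To make the pre-train-then-fine-tune paradigm described in the abstract concrete, the sketch below shows one plausible instantiation in PyTorch: a small 1D-CNN encoder is pre-trained with a self-supervised masked-reconstruction objective on unlabeled series, then fine-tuned with a linear classification head on a small labeled set. This is a minimal illustration only; the ConvEncoder, the masking ratio, and all hyperparameters are assumptions for demonstration, not the implementation of the surveyed paper or of any specific TS-PTM.

# Minimal sketch of self-supervised pre-training followed by supervised
# fine-tuning for time series. All names and hyperparameters below are
# illustrative assumptions, not taken from the surveyed paper.
import torch
import torch.nn as nn

class ConvEncoder(nn.Module):
    """Small 1D-CNN encoder: (batch, 1, length) -> (batch, dim)."""
    def __init__(self, dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(32, dim, kernel_size=7, padding=3), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # global average pooling over time
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)

def pretrain(encoder, series, epochs=5):
    """Self-supervised stage: reconstruct randomly masked time points."""
    decoder = nn.Linear(64, series.shape[-1])
    opt = torch.optim.Adam(
        list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)
    for _ in range(epochs):
        mask = (torch.rand_like(series) > 0.25).float()  # hide ~25% of points
        recon = decoder(encoder((series * mask).unsqueeze(1)))
        loss = ((recon - series) ** 2 * (1 - mask)).mean()  # error on hidden points
        opt.zero_grad(); loss.backward(); opt.step()

def finetune(encoder, series, labels, num_classes, epochs=5):
    """Supervised stage: attach a linear head, adapt on the small labeled set."""
    head = nn.Linear(64, num_classes)
    # Lower learning rate so the pre-trained encoder weights change gently.
    opt = torch.optim.Adam(
        list(encoder.parameters()) + list(head.parameters()), lr=1e-4)
    criterion = nn.CrossEntropyLoss()
    for _ in range(epochs):
        logits = head(encoder(series.unsqueeze(1)))
        loss = criterion(logits, labels)
        opt.zero_grad(); loss.backward(); opt.step()
    return head

# Usage with synthetic data: many unlabeled series, few labeled ones.
enc = ConvEncoder()
pretrain(enc, torch.randn(256, 128))  # 256 unlabeled series of length 128
finetune(enc, torch.randn(32, 128), torch.randint(0, 3, (32,)), num_classes=3)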
Pages: 7536-7555
Number of pages: 20