Multi-task meta label correction for time series prediction

Cited by: 2
Authors
Yang, Luxuan [1 ,2 ]
Gao, Ting [1 ,2 ]
Wei, Wei [3 ]
Dai, Min [4 ]
Fang, Cheng [1 ,2 ]
Duan, Jinqiao [5 ,6 ,7 ]
Affiliations
[1] Huazhong Univ Sci & Technol, Sch Math & Stat, Wuhan 430074, Peoples R China
[2] Huazhong Univ Sci & Technol, Ctr Math Sci, Wuhan 430074, Peoples R China
[3] Shanghai Jiao Tong Univ, Inst Nat Sci, Shanghai 200240, Peoples R China
[4] Wuhan Univ Technol, Sch Sci, Wuhan 430070, Peoples R China
[5] Great Bay Univ, Dept Math, Dongguan 523000, Peoples R China
[6] Great Bay Univ, Dept Phys, Dongguan 523000, Peoples R China
[7] Dongguan Key Lab Data Sci & Intelligent Med, Dept Lab, Dongguan 523000, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Data visualization; Bi-level optimization; Meta-learning; Multi-task learning; RECURRENCE PLOTS;
DOI
10.1016/j.patcog.2024.110319
CLC number
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Time series classification faces two unavoidable problems: partial feature information and poor label quality, both of which can degrade model performance. To address these issues, we develop a label correction method for time series data based on meta-learning under a multi-task framework. There are three main contributions. First, we train the label correction model with a two-branch neural network in the outer loop, while in the model-agnostic inner loop we use pre-existing classification models in a multi-task way and jointly update the meta-knowledge, which helps us achieve adaptive labeling on complex time series. Second, we devise new data visualization methods for both image patterns of the historical data and data in the prediction horizon. Finally, we test our method on various financial datasets, including XOM, S&P500, and SZ50. Results show that our method is more effective and accurate than some existing label correction techniques.
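The bi-level scheme the abstract describes (an outer loop that learns how to correct noisy labels against a trusted meta set, and an inner loop that fits classifiers to the corrected labels) can be illustrated with a deliberately simplified sketch. This is not the authors' model: the two-branch correction network is reduced to a single blending coefficient `alpha` chosen by grid search instead of meta-gradients, and 1-D logistic regression on synthetic data stands in for the time-series classifiers. All names (`train_inner`, `val_loss`, `alpha`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D data: two Gaussian classes; 30% of training labels are flipped.
def make_data(n):
    y = rng.integers(0, 2, n)
    x = rng.normal(loc=2.0 * y - 1.0, scale=1.0)  # shape (n,)
    return x, y

x_tr, y_tr = make_data(400)
x_val, y_val = make_data(200)            # small clean meta/validation set
flip = rng.random(400) < 0.3
y_noisy = np.where(flip, 1 - y_tr, y_tr)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_inner(soft_labels, steps=200, lr=0.5):
    """Inner loop: fit a logistic model to (possibly corrected) soft labels."""
    w, b = 0.0, 0.0
    for _ in range(steps):
        p = sigmoid(w * x_tr + b)
        g = p - soft_labels              # cross-entropy gradient wrt logits
        w -= lr * np.mean(g * x_tr)
        b -= lr * np.mean(g)
    return w, b

def val_loss(w, b):
    p = sigmoid(w * x_val + b)
    return -np.mean(y_val * np.log(p + 1e-9) + (1 - y_val) * np.log(1 - p + 1e-9))

# Warm-up model trained directly on noisy labels; its predictions act as one
# "branch" of the correction signal.
w0, b0 = train_inner(y_noisy.astype(float))
p0 = sigmoid(w0 * x_tr + b0)

# Outer loop: pick the correction strength alpha that minimizes the meta
# (clean-validation) loss of the resulting inner-loop model.
best = None
for alpha in np.linspace(0.0, 1.0, 11):
    corrected = (1.0 - alpha) * y_noisy + alpha * p0   # soft label correction
    w, b = train_inner(corrected)
    loss = val_loss(w, b)
    if best is None or loss < best[0]:
        best = (loss, alpha, w, b)

loss, alpha, w, b = best
acc = np.mean((sigmoid(w * x_val + b) > 0.5) == y_val)
print(f"best alpha={alpha:.1f}, clean val accuracy={acc:.2f}")
```

In the paper itself the outer update is a meta-gradient through the inner optimization rather than a grid search, and the inner loop jointly updates several task-specific classifiers; the grid search here only keeps the bi-level structure visible in a few lines.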
Pages: 9
Related Papers
50 records in total
  • [1] Bayesian Multi-task Learning for Dynamic Time Series Prediction
    Chandra, Rohitash
    Cripps, Sally
    2018 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2018, : 390 - 397
  • [2] Multi-task Modular Backpropagation for Dynamic Time Series Prediction
    Chandra, Rohitash
    2018 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2018,
  • [3] Co-evolutionary multi-task learning for dynamic time series prediction
    Chandra, Rohitash
    Ong, Yew-Soon
    Goh, Chi-Keong
    APPLIED SOFT COMPUTING, 2018, 70 : 576 - 589
  • [4] Deep multi-task learning model for time series prediction in wireless communication
    Cao, Kailin
    Hu, Ting
    Li, Zishuo
    Zhao, Guoshuai
    Qian, Xueming
    PHYSICAL COMMUNICATION, 2021, 44
  • [5] Multi-Task Time Series Forecasting With Shared Attention
    Chen, Zekai
    Jiaze, E.
    Zhang, Xiao
    Sheng, Hao
    Cheng, Xiuzheng
    20TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOPS (ICDMW 2020), 2020, : 917 - 925
  • [6] Multi-Task Diffusion Learning for Time Series Classification
    Zheng, Shaoqiu
    Liu, Zhen
    Tian, Long
    Ye, Ling
    Zheng, Shixin
    Peng, Peng
    Chu, Wei
    ELECTRONICS, 2024, 13 (20)
  • [7] Multi-task Learning Method for Hierarchical Time Series Forecasting
    Yang, Maoxin
    Hu, Qinghua
    Wang, Yun
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2019: TEXT AND TIME SERIES, PT IV, 2019, 11730 : 474 - 485
  • [8] Pre-SMATS: A multi-task learning based prediction model for small multi seasonal time series
    Wu, Shiling
    Peng, Dunlu
    EXPERT SYSTEMS WITH APPLICATIONS, 2022, 201
  • [9] MultiTL-KELM: A multi-task learning algorithm for multi-step-ahead time series prediction
    Ye, Rui
    Dai, Qun
    APPLIED SOFT COMPUTING, 2019, 79 : 227 - 253
  • [10] Multi-task label noise learning for classification
    Liu, Zongmin
    Wang, Ziyi
    Wang, Ting
    Xu, Yitian
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2024, 130