Temporal-Frequency Co-training for Time Series Semi-supervised Learning

Cited by: 0
Authors
Liu, Zhen [1]
Ma, Qianli [1,2]
Ma, Peitian [1]
Wang, Linghao [1]
Affiliations
[1] South China Univ Technol, Sch Comp Sci & Engn, Guangzhou, Peoples R China
[2] South China Univ Technol, Key Lab Big Data & Intelligent Robot, Minist Educ, Guangzhou, Peoples R China
Funding
National Natural Science Foundation of China
DOI: not available
CLC number
TP18 [Theory of Artificial Intelligence]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Semi-supervised learning (SSL) has been actively studied due to its ability to alleviate the reliance of deep learning models on labeled data. Although existing SSL methods based on pseudo-labeling strategies have made great progress, they rarely consider the intrinsic properties of time-series data (e.g., temporal dependence). Learning representations by mining the inherent properties of time series has recently gained much attention. Nonetheless, how to utilize feature representations to design SSL paradigms for time series has not been explored. To this end, we propose a Time Series SSL framework via Temporal-Frequency Co-training (TS-TFC), leveraging the complementary information from two distinct views for unlabeled data learning. In particular, TS-TFC employs time-domain and frequency-domain views to train two deep neural networks simultaneously, and each view's pseudo-labels, generated by label propagation in the representation space, are adopted to guide the training of the other view's classifier. To enhance the discriminability of representations between categories, we propose a temporal-frequency supervised contrastive learning module, which integrates the learning difficulty of categories to improve the quality of pseudo-labels. By co-training on the pseudo-labels obtained from temporal-frequency representations, the complementary information in the two distinct views is exploited to enable the model to better learn the distribution of categories. Extensive experiments on 106 UCR datasets show that TS-TFC outperforms state-of-the-art methods, demonstrating the effectiveness and robustness of our proposed model.
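The cross-view pseudo-labeling idea described in the abstract can be sketched in a few lines: build a time-domain view and a frequency-domain view of the same series, let each view pseudo-label the unlabeled set, then use each view's pseudo-labels to supervise the other view. The snippet below is a minimal illustration only, with toy sine-wave data and nearest-centroid classifiers standing in for the paper's deep networks and label-propagation step; all data and helper names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class time series: sine waves of different frequencies, random phase
def make_series(freq, n, length=64):
    t = np.arange(length) / length
    return np.stack([np.sin(2 * np.pi * freq * t + rng.uniform(0, 2 * np.pi))
                     + 0.1 * rng.standard_normal(length) for _ in range(n)])

X_lab = np.concatenate([make_series(2, 5), make_series(8, 5)])   # small labeled set
y_lab = np.array([0] * 5 + [1] * 5)
X_unl = np.concatenate([make_series(2, 20), make_series(8, 20)])  # unlabeled set

# Frequency-domain view: magnitude spectrum of each series
def freq_view(X):
    return np.abs(np.fft.rfft(X, axis=1))

# Nearest-centroid "classifier" per view (stand-in for the deep networks)
def fit(X, y):
    return np.stack([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(centroids, X):
    d = np.linalg.norm(X[:, None, :] - centroids[None], axis=2)
    return d.argmin(axis=1)

# Each view pseudo-labels the unlabeled set from its own representation space
p_time = predict(fit(X_lab, y_lab), X_unl)
p_freq = predict(fit(freq_view(X_lab), y_lab), freq_view(X_unl))

# Co-training step: retrain each view using the OTHER view's pseudo-labels
c_time = fit(np.concatenate([X_lab, X_unl]), np.concatenate([y_lab, p_freq]))
c_freq = fit(freq_view(np.concatenate([X_lab, X_unl])), np.concatenate([y_lab, p_time]))
```

Here the frequency view separates the two classes easily (distinct spectral peaks) even though random phases make the raw time-domain centroids uninformative, which is exactly the complementarity the two-view co-training is meant to exploit.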
Pages: 8923-8931 (9 pages)