Time-series domain adaptation via sparse associative structure alignment: Learning invariance and variance

Cited by: 0
Authors
Li, Zijian [1 ,2 ]
Cai, Ruichu [1 ]
Chen, Jiawei [1 ]
Yan, Yuguang [1 ]
Chen, Wei [1 ]
Zhang, Keli [3 ]
Ye, Junjian [3 ]
Affiliations
[1] Guangdong Univ Technol, Guangzhou 510006, Guangdong, Peoples R China
[2] Mohamed Bin Zayed Univ Artificial Intelligence, Abu Dhabi, U Arab Emirates
[3] Huawei Noah's Ark Lab, Shenzhen 518116, Guangdong, Peoples R China
Keywords
Time series data; Time series domain adaptation; Transfer learning; MODEL
DOI
10.1016/j.neunet.2024.106659
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Domain adaptation for time-series data is an important but challenging task in real-world scenarios: it arises frequently in industry, for example in anomaly detection and sensor data forecasting, yet has received limited attention in academia. Most existing methods carry over the covariate shift assumption developed for non-time-series data in order to extract a domain-invariant representation, but this assumption is hard to satisfy in practice because of the complex dependence among variables, where a small change in the time lags can lead to a large change in future values. To address this challenge, we leverage the stability of causal structures across different domains. To further avoid the strong assumptions required by causal discovery, such as the linear non-Gaussian assumption, we relax the problem and mine stable sparse associative structures instead of discovering the causal structures directly. Besides the domain-invariant structures, we also find that some domain-specific information, such as the strengths of the structures, is important for prediction. Based on this intuition, we extend the sparse associative structure alignment model of the conference version to the Sparse Associative Structure Alignment model with domain-specific information enhancement (SASA2 for short), which aligns the invariant unweighted sparse associative structures and exploits the variant information for time-series unsupervised domain adaptation. Specifically, we first generate the segment set to remove the obstacle of offsets. Second, we extract the unweighted sparse associative structures via sparse attention mechanisms. Third, we extract the domain-specific information via an autoregressive module. Finally, we employ a unidirectional alignment restriction to guide the transformation from the source to the target. Moreover, we provide a generalization analysis that shows the theoretical superiority of our method. Compared with existing methods, our method yields state-of-the-art performance, with a 5% relative improvement on three real-world datasets covering different applications: air quality, in-hospital healthcare, and anomaly detection. Furthermore, visualization of the sparse associative structures illustrates what knowledge can be transferred, improving the transparency and interpretability of our method.
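The abstract outlines a pipeline of segment generation, sparse-attention structure extraction, and unidirectional source-to-target alignment. As an illustration only, and not the authors' SASA2 implementation, the minimal PyTorch sketch below shows how a sparsemax-based attention over per-variable segment encodings could expose a sparse associative structure, and how a one-sided loss could pull the target-domain structure toward a detached source-domain structure; the names SparseAssociativeAttention and structure_alignment_loss, as well as all tensor shapes, are assumptions made for this sketch.

```python
# Minimal, self-contained sketch (NOT the authors' SASA2 code): it illustrates
# how a sparsemax-style attention over variable-wise segment encodings could
# expose a sparse associative structure, and how a unidirectional loss could
# pull the target-domain structure toward a detached source-domain structure.
# All names and tensor shapes are assumptions made for this illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F


def sparsemax(logits: torch.Tensor, dim: int = -1) -> torch.Tensor:
    """Sparsemax (Martins & Astudillo, 2016): a softmax alternative that can
    assign exactly zero weight, yielding a sparse structure."""
    z_sorted, _ = torch.sort(logits, descending=True, dim=dim)
    cumsum = z_sorted.cumsum(dim)
    k = torch.arange(1, logits.size(dim) + 1, device=logits.device, dtype=logits.dtype)
    shape = [1] * logits.dim()
    shape[dim] = -1
    support = (1 + k.view(shape) * z_sorted) > cumsum          # prefix of kept entries
    k_support = support.sum(dim=dim, keepdim=True).clamp(min=1)
    tau = (cumsum.gather(dim, k_support - 1) - 1) / k_support.to(logits.dtype)
    return torch.clamp(logits - tau, min=0.0)


class SparseAssociativeAttention(nn.Module):
    """Encodes each variable's segment with a GRU and computes variable-to-variable
    sparse attention weights; thresholding/binarizing them would give an
    unweighted associative structure in the spirit of the abstract."""

    def __init__(self, hidden: int = 32):
        super().__init__()
        self.encoder = nn.GRU(input_size=1, hidden_size=hidden, batch_first=True)
        self.query = nn.Linear(hidden, hidden)
        self.key = nn.Linear(hidden, hidden)

    def forward(self, segments: torch.Tensor) -> torch.Tensor:
        # segments: (batch, num_vars, seg_len)
        b, v, t = segments.shape
        h, _ = self.encoder(segments.reshape(b * v, t, 1))      # encode each univariate segment
        h = h[:, -1, :].reshape(b, v, -1)                       # last hidden state per variable
        scores = self.query(h) @ self.key(h).transpose(1, 2) / h.size(-1) ** 0.5
        return sparsemax(scores, dim=-1)                        # (batch, num_vars, num_vars)


def structure_alignment_loss(src_struct: torch.Tensor, tgt_struct: torch.Tensor) -> torch.Tensor:
    # Unidirectional restriction: the source structure is detached so only the
    # target side is pulled toward it (source -> target guidance).
    return F.l1_loss(tgt_struct, src_struct.detach())


if __name__ == "__main__":
    torch.manual_seed(0)
    model = SparseAssociativeAttention(hidden=32)
    src_segments = torch.randn(8, 6, 24)   # toy source segments: 8 samples, 6 variables, length 24
    tgt_segments = torch.randn(8, 6, 24)   # toy target segments
    loss = structure_alignment_loss(model(src_segments), model(tgt_segments))
    print("structure alignment loss:", loss.item())
```

The sparsemax keeps the extracted structure genuinely sparse (many exact zeros), and detaching the source structure in the loss keeps the alignment one-directional, matching the source-to-target guidance described in the abstract.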
Pages: 13