Time Series Classification With Large Language Models via Linguistic Scaffolding

Cited: 0
Authors
Jang, Hyeongwon [1,2]
Yang, June Yong [2]
Hwang, Jaeryong [3]
Yang, Eunho [2,4]
Affiliations
[1] Seoul Natl Univ, Dept Math Sci, Seoul 08826, South Korea
[2] Korea Adv Inst Sci & Technol, Kim Jaechul Grad Sch Artificial Intelligence, Daejeon 34141, South Korea
[3] Republ Korea Naval Acad, Dept Cyber Sci, Changwon Si 51704, South Korea
[4] AITRICS, Seoul 06221, South Korea
Source
IEEE ACCESS, 2024, Vol. 12
Keywords
Time series analysis; Transformers; Linguistics; Large language models; Semantics; Cognition; Predictive models; Vectors; Sensors; Encoding; Artificial neural networks; Classification algorithms; deep learning; irregular time series classification; pre-trained language models; time series classification;
DOI
10.1109/ACCESS.2024.3475471
CLC Classification
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Time series classification requires specialized models that can effectively capture temporal structures. Consequently, Large Language Models (LLMs) have emerged as promising candidates due to their proficiency in sequence modeling and semantic reasoning. However, converting time series data into text results in sequences that exceed the maximum token limit, necessitating truncation or the removal of word embeddings for fixed-length time series embeddings. This restriction not only sacrifices semantic reasoning capabilities accessed through natural language but also limits the ability to handle temporal irregularities. To overcome these challenges, we propose the Language-Scaffolded Time Series Transformer (LSTST), which combines linguistic components and time series embeddings to effectively harness LLMs while overcoming dimensional constraints. Our Language Scaffold reformulates time series classification as a contextual question-answering task, with time series embeddings as context, facilitating the LLM to utilize its inherent semantic knowledge. Moreover, the preserved linguistic structure allows a dynamic number of input context embeddings with real-time positional encoding, handling length restrictions and irregularity in the temporal dimension. Through experiments, we show that LSTST achieves state-of-the-art performance on regular time series classification and also handles irregular time series without any model modifications.
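The abstract describes the Language Scaffold only at a high level: time series embeddings are injected as "context" inside a question-answering style prompt so the language model can attend to them alongside ordinary word embeddings, and positions are assigned at run time to accommodate irregular sampling. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' implementation; the prompt handling, module names, dimensions, and the small Transformer used as a stand-in for the frozen LLM backbone are all assumptions made for demonstration.

# Minimal, illustrative sketch of a "language scaffold" style classifier:
# time series observations are projected into the token embedding space and
# prepended as context to a textual question, then a class is read off the
# final ("Answer:") position. All names and sizes here are hypothetical.
import torch
import torch.nn as nn


class LanguageScaffoldClassifier(nn.Module):
    def __init__(self, vocab_size=32000, d_model=256, n_classes=5, max_len=1024):
        super().__init__()
        # Stand-in for a pre-trained LLM backbone (frozen in the real setting).
        self.token_emb = nn.Embedding(vocab_size, d_model)
        self.pos_emb = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, num_layers=2)
        # Projects each (possibly irregularly sampled) observation into the
        # language model's embedding space.
        self.ts_proj = nn.Linear(1, d_model)
        self.classifier = nn.Linear(d_model, n_classes)

    def forward(self, prompt_ids, ts_values, ts_positions):
        # prompt_ids:   (B, Lp) token ids of the scaffold text, e.g.
        #               "Context: <series> Question: what is the class? Answer:"
        # ts_values:    (B, Lt, 1) raw observations used as context embeddings
        # ts_positions: (B, Lt) integer positions assigned at run time, so an
        #               irregular or variable-length series needs no fixed grid.
        B, Lp = prompt_ids.shape
        word = self.token_emb(prompt_ids) + self.pos_emb(
            torch.arange(Lp, device=prompt_ids.device)
        )
        series = self.ts_proj(ts_values) + self.pos_emb(ts_positions)
        # Interleave: series-as-context first, then the textual question.
        x = torch.cat([series, word], dim=1)
        h = self.backbone(x)
        # Classify from the final position (the "Answer:" slot).
        return self.classifier(h[:, -1])


if __name__ == "__main__":
    model = LanguageScaffoldClassifier()
    prompt = torch.randint(0, 32000, (2, 16))        # dummy scaffold tokens
    values = torch.randn(2, 40, 1)                    # dummy series, length 40
    positions = torch.sort(torch.randint(0, 512, (2, 40)), dim=1).values
    print(model(prompt, values, positions).shape)     # -> torch.Size([2, 5])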
Pages: 170387-170398
Page count: 12