Time Series Classification With Large Language Models via Linguistic Scaffolding

Cited: 0
Authors
Jang, Hyeongwon [1 ,2 ]
Yang, June Yong [2]
Hwang, Jaeryong [3 ]
Yang, Eunho [2 ,4 ]
Affiliations
[1] Seoul Natl Univ, Dept Math Sci, Seoul 08826, South Korea
[2] Korea Adv Inst Sci & Technol, Kim Jaechul Grad Sch Artificial Intelligence, Daejeon 34141, South Korea
[3] Republ Korea Naval Acad, Dept Cyber Sci, Changwon Si 51704, South Korea
[4] AITRICS, Seoul 06221, South Korea
Source
IEEE ACCESS | 2024, Vol. 12
Keywords
Time series analysis; Transformers; Linguistics; Large language models; Semantics; Cognition; Predictive models; Vectors; Sensors; Encoding; Artificial neural networks; Classification algorithms; deep learning; irregular time series classification; pre-trained language models; time series classification;
DOI
10.1109/ACCESS.2024.3475471
CLC Classification Number
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
Time series classification requires specialized models that can effectively capture temporal structures. Consequently, Large Language Models (LLMs) have emerged as promising candidates due to their proficiency in sequence modeling and semantic reasoning. However, converting time series data into text results in sequences that exceed the maximum token limit, necessitating truncation or the removal of word embeddings in favor of fixed-length time series embeddings. This restriction not only sacrifices the semantic reasoning capabilities accessed through natural language but also limits the ability to handle temporal irregularities. To overcome these challenges, we propose the Language-Scaffolded Time Series Transformer (LSTST), which combines linguistic components and time series embeddings to effectively harness LLMs while overcoming dimensional constraints. Our Language Scaffold reformulates time series classification as a contextual question-answering task, with time series embeddings as context, enabling the LLM to utilize its inherent semantic knowledge. Moreover, the preserved linguistic structure allows a dynamic number of input context embeddings with real-time positional encoding, handling length restrictions and irregularity in the temporal dimension. Through experiments, we show that LSTST achieves state-of-the-art performance on regular time series classification and also handles irregular time series without any model modifications.
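The abstract does not give implementation details, but the core idea it describes (time series embeddings used as "context" tokens in a question-answering prompt, with positional encodings computed from real-valued timestamps so irregular sampling is handled) can be illustrated with a minimal sketch. All names below (`time_positional_encoding`, `scaffolded_input`, `W_proj`) are hypothetical and chosen for illustration; this is not the paper's code.

```python
import numpy as np

def time_positional_encoding(timestamps, d_model):
    """Sinusoidal encoding evaluated at real-valued timestamps, so an
    irregularly sampled series gets positions from its actual observation
    times rather than from integer indices."""
    t = np.asarray(timestamps, dtype=float)[:, None]       # (T, 1)
    i = np.arange(d_model // 2)[None, :]                   # (1, d/2)
    freqs = 1.0 / (10000.0 ** (2 * i / d_model))           # per-dimension frequencies
    pe = np.empty((len(t), d_model))
    pe[:, 0::2] = np.sin(t * freqs)
    pe[:, 1::2] = np.cos(t * freqs)
    return pe

def scaffolded_input(series, timestamps, question_embeds, W_proj):
    """Build [context embeddings ; question embeddings] for a frozen LLM:
    each observation is projected into the model's embedding space and
    tagged with a positional encoding from its timestamp. The number of
    context embeddings is dynamic, so no truncation to a fixed length."""
    x = np.asarray(series, dtype=float)[:, None]           # (T, 1) univariate values
    context = x @ W_proj + time_positional_encoding(timestamps, W_proj.shape[1])
    return np.concatenate([context, question_embeds], axis=0)

# Usage: 4 irregularly spaced observations followed by 3 question tokens.
rng = np.random.default_rng(0)
d = 8
W_proj = rng.normal(size=(1, d))
question = rng.normal(size=(3, d))   # stand-in for embedded question tokens
out = scaffolded_input([0.1, 0.5, 0.9, 2.0], [0.0, 0.4, 0.8, 2.5], question, W_proj)
print(out.shape)  # (7, 8): 4 context embeddings + 3 question embeddings
```

The key point the sketch conveys is that the linguistic question tokens stay in the input, so the LLM's semantic knowledge remains accessible, while the context portion grows and shrinks with the series length.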
Pages: 170387-170398
Page count: 12
Related Papers
50 records
  • [21] Multimodal Food Image Classification with Large Language Models
    Kim, Jun-Hwa
    Kim, Nam-Ho
    Jo, Donghyeok
    Won, Chee Sun
    ELECTRONICS, 2024, 13 (22)
  • [22] TS-HTFA: Advancing Time-Series Forecasting via Hierarchical Text-Free Alignment with Large Language Models
    Wang, Pengfei
    Zheng, Huanran
    Xu, Qi'ao
    Dai, Silong
    Wang, Yiqiao
    Yue, Wenjing
    Zhu, Wei
    Qian, Tianwen
    Zhao, Liang
    SYMMETRY-BASEL, 2025, 17 (03):
  • [23] Toward Interactive Next Location Prediction Driven by Large Language Models
    Chen, Yong
    Chi, Ben
    Li, Chuanjia
    Zhang, Yuliang
    Liao, Chenlei
    Chen, Xiqun
    Xie, Na
    IEEE TRANSACTIONS ON COMPUTATIONAL SOCIAL SYSTEMS, 2025,
  • [24] Understanding the Role of Large Language Models in Personalizing and Scaffolding Strategies to Combat Academic Procrastination
    Bhattacharjee, Ananya
    Zeng, Yuchen
    Xu, Sarah Yi
    Kulzhabayeva, Dana
    Ma, Minyi
    Kornfeld, Rachel
    Ahmed, Syed Ishtiaque
    Mariakakis, Alex
    Czerwinski, Mary P.
    Kuzminykh, Anastasia
    Liut, Michael
    Williams, Joseph Jay
    PROCEEDINGS OF THE 2024 CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS, CHI 2024, 2024,
  • [25] Selenite: Scaffolding Online Sensemaking with Comprehensive Overviews Elicited from Large Language Models
    Liu, Michael Xieyang
    Wu, Tongshuang
    Chen, Tianying
    Li, Franklin Mingzhe
    Kittur, Aniket
    Myers, Brad A.
    PROCEEDINGS OF THE 2024 CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS, CHI 2024, 2024,
  • [26] Time series classification: nearest neighbor versus deep learning models
    Jiang, Weiwei
    SN APPLIED SCIENCES, 2020, 2 (04):
  • [27] Time series classification: nearest neighbor versus deep learning models
    Weiwei Jiang
    SN Applied Sciences, 2020, 2
  • [28] A Survey on Time-Series Pre-Trained Models
    Ma, Qianli
    Liu, Zhen
    Zheng, Zhenjing
    Huang, Ziyang
    Zhu, Siying
    Yu, Zhongzhong
    Kwok, James T.
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2024, 36 (12) : 7536 - 7555
  • [29] Improving Vision-Language Models With Attention Mechanisms for Aerial Video Classification
    Tu, Nguyen Anh
    Aikyn, Nartay
    IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2025, 22
  • [30] Assessing Inference Time in Large Language Models
    Walkowiak, Bartosz
    Walkowiak, Tomasz
    SYSTEM DEPENDABILITY-THEORY AND APPLICATIONS, DEPCOS-RELCOMEX 2024, 2024, 1026 : 296 - 305