Synthetic Time Series Generation for Decision Intelligence Using Large Language Models

Cited by: 2
Authors
Grigoras, Alexandru [1 ]
Leon, Florin [1 ]
Affiliations
[1] "Gheorghe Asachi" Technical University of Iasi, Faculty of Automatic Control and Computer Engineering, Bd. Mangeron 27, Iasi 700050, Romania
Keywords
transformer architecture; large language models; synthetic data; time series; decision intelligence
DOI
10.3390/math12162494
Chinese Library Classification: O1 [Mathematics]
Discipline Code: 0701; 070101
Abstract
A model for generating synthetic time series data using pre-trained large language models is proposed. The approach builds on the Google T5-base model, an encoder-decoder transformer pre-trained on diverse datasets, which is then fine-tuned with the QLoRA technique, reducing computational complexity by quantizing the weight parameters. Time series data are tokenized through mean scaling and quantization. The model's performance was evaluated with fidelity, utility, and privacy metrics, showing improvements in fidelity and utility at the cost of reduced privacy. The proposed model offers a foundation for decision intelligence systems.
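The tokenization step mentioned in the abstract can be illustrated with a minimal sketch: real-valued observations are mean-scaled and then uniformly quantized into a fixed integer vocabulary that a T5-style model can consume. The function names, vocabulary size, and clipping range below are illustrative assumptions, not the authors' exact implementation.

import numpy as np

def tokenize_series(values, vocab_size=4096, clip=15.0):
    # Mean scaling: divide by the mean absolute value so that series
    # of different magnitudes share the same token range.
    values = np.asarray(values, dtype=np.float64)
    scale = float(np.mean(np.abs(values))) or 1.0
    scaled = np.clip(values / scale, -clip, clip)
    # Uniform quantization: map [-clip, clip] onto integer token ids
    # in [0, vocab_size - 1].
    tokens = np.round((scaled + clip) / (2 * clip) * (vocab_size - 1)).astype(int)
    return tokens, scale

def detokenize_series(tokens, scale, vocab_size=4096, clip=15.0):
    # Invert the quantization (token id -> scaled value), then undo
    # the mean scaling to recover approximate real values.
    scaled = np.asarray(tokens, dtype=np.float64) / (vocab_size - 1) * (2 * clip) - clip
    return scaled * scale

# Example: round-trip a short series through the token vocabulary.
series = np.array([10.0, 12.5, 9.8, 11.2, 13.0])
tokens, scale = tokenize_series(series)
reconstructed = detokenize_series(tokens, scale)

In a pipeline like the one described, the resulting token ids would be passed to the fine-tuned T5-base model for sequence generation and mapped back to real values with the stored scale.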
Pages: 17