TTVAE: Transformer-based generative modeling for tabular data generation

Cited by: 1
Authors
Wang, Alex X. [1]
Nguyen, Binh P. [1,2]
Affiliations
[1] Victoria Univ Wellington, Sch Math & Stat, Wellington 6012, New Zealand
[2] Ho Chi Minh City Open Univ, Fac Informat Technol, 97 Vo Van Tan, Dist 3, Ho Chi Minh City 70000, Vietnam
Keywords
Generative AI; Tabular data; Transformer; Latent space interpolation; SMOTE
DOI
10.1016/j.artint.2025.104292
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Tabular data synthesis presents unique challenges; while Variational Autoencoders (VAEs) and Generative Adversarial Networks (GANs) have been widely applied to it, Transformer models remain underexplored. To address this gap, we propose the Transformer-based Tabular Variational AutoEncoder (TTVAE), which leverages the attention mechanism to capture complex data distributions and to model relationships among heterogeneous features, a task that is often difficult for traditional methods. TTVAE integrates interpolation into the latent space during data generation: the model is trained once to establish a low-dimensional representation of the real data, after which various latent interpolation methods can efficiently generate synthetic latent points. In extensive experiments on diverse datasets, TTVAE consistently achieves state-of-the-art performance, highlighting its adaptability across different feature types and data sizes. Empowered by the attention mechanism and latent-space interpolation, TTVAE addresses the complex challenges of tabular data synthesis and offers a powerful solution.
Pages: 17
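
To make the pipeline in the abstract concrete, here is a minimal, hypothetical Python sketch of SMOTE-style interpolation in a learned latent space: real rows are encoded once, new latent points are sampled by interpolating between nearest latent neighbors, and the decoder maps them back to tabular form. The random linear encoder/decoder and the values of latent_dim, k, and n_synthetic below are illustrative stand-ins, not the paper's TTVAE implementation.

import numpy as np

rng = np.random.default_rng(0)
n_rows, n_features, latent_dim = 200, 10, 4

# Stand-in "real" table; in practice this would be preprocessed tabular data.
X_real = rng.normal(size=(n_rows, n_features))

# Random linear maps standing in for TTVAE's trained Transformer-based
# encoder and decoder (hypothetical placeholders so the sketch runs).
W_enc = rng.normal(size=(n_features, latent_dim))
W_dec = rng.normal(size=(latent_dim, n_features))

def encode(X):
    return X @ W_enc

def decode(Z):
    return Z @ W_dec

def interpolate_latents(Z, n_synthetic, k=5):
    """SMOTE-style latent interpolation: for each synthetic point, pick a
    real latent point, pick one of its k nearest latent neighbors, and
    sample a convex combination of the pair."""
    d = np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                  # exclude self-matches
    neighbors = np.argsort(d, axis=1)[:, :k]     # k nearest per point
    idx = rng.integers(0, len(Z), size=n_synthetic)
    nbr = neighbors[idx, rng.integers(0, k, size=n_synthetic)]
    lam = rng.uniform(0.0, 1.0, size=(n_synthetic, 1))
    return Z[idx] + lam * (Z[nbr] - Z[idx])

Z_real = encode(X_real)                          # encode real data once
Z_synth = interpolate_latents(Z_real, n_synthetic=500)
X_synth = decode(Z_synth)                        # decode to synthetic rows
print(X_synth.shape)                             # (500, 10)

Swapping the stand-in linear maps for TTVAE's trained Transformer encoder and decoder would yield the train-once, interpolate-many-times workflow the abstract describes.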