Multi-Label Conditional Generation From Pre-Trained Models

Cited by: 0
Authors
Proszewska, Magdalena [1]
Wolczyk, Maciej [1]
Zieba, Maciej [2,3]
Wielopolski, Patryk [4]
Maziarka, Lukasz [1]
Smieja, Marek [1]
Affiliations
[1] Jagiellonian Univ, Fac Math & Comp Sci, PL-31007 Krakow, Poland
[2] Tooploox, PL-53601 Wroclaw, Poland
[3] Wroclaw Univ Sci & Technol, PL-53601 Wroclaw, Poland
[4] Wroclaw Univ Sci & Technol, PL-50370 Wroclaw, Poland
Keywords
Training; Computational modeling; Adaptation models; Vectors; Data models; Aerospace electronics; Three-dimensional displays; Conditional generation; deep generative models; GANs; invertible normalizing flows; pre-trained models; VAEs
DOI
10.1109/TPAMI.2024.3382008
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Although modern generative models achieve excellent quality in a variety of tasks, they often lack the essential ability to generate examples with requested properties, such as the age of the person in a photo or the weight of a generated molecule. To overcome these limitations, we propose PluGeN (Plugin Generative Network), a simple yet effective generative technique that can be used as a plugin for pre-trained generative models. The idea behind our approach is to transform the entangled latent representation, using a flow-based module, into a multi-dimensional space where the value of each attribute is modeled as an independent one-dimensional distribution. As a consequence, PluGeN can generate new samples with desired attributes as well as manipulate labeled attributes of existing examples. Owing to the disentanglement of the latent representation, we are even able to generate samples with combinations of attributes that are rare or absent in the dataset, such as a young person with gray hair, men with make-up, or women with beards. In contrast to competing approaches, PluGeN can be trained on partially labeled data. We combined PluGeN with GAN and VAE models and applied it to conditional generation and manipulation of images, chemical molecule modeling, and 3D point cloud generation.
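The mechanism described in the abstract lends itself to a short illustration: an invertible coupling layer stands in for the flow-based module, mapping the frozen generator's latent vector to a space in which the first few coordinates are reserved for labeled attributes, so a requested attribute combination can be fixed and mapped back to the original latent space. The sketch below is a rough, untrained approximation under assumed names and dimensions (the toy `decoder`, `AffineCoupling`, `latent_dim`, and `n_attrs` are placeholders), not the paper's implementation.

```python
# Minimal illustrative sketch of a plugin flow on top of a frozen pre-trained
# generator. All names, dimensions, and the toy decoder are assumptions made
# for this example; this is NOT the authors' reference implementation.
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """One RealNVP-style affine coupling block (invertible by construction)."""
    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        self.half = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.half, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.half)),
        )

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        # latent space -> attribute/style space
        z1, z2 = z[:, :self.half], z[:, self.half:]
        log_s, t = self.net(z1).chunk(2, dim=1)
        return torch.cat([z1, z2 * torch.exp(log_s) + t], dim=1)

    def inverse(self, w: torch.Tensor) -> torch.Tensor:
        # attribute/style space -> latent space
        w1, w2 = w[:, :self.half], w[:, self.half:]
        log_s, t = self.net(w1).chunk(2, dim=1)
        return torch.cat([w1, (w2 - t) * torch.exp(-log_s)], dim=1)

latent_dim, n_attrs = 8, 3
# Stand-in for a frozen pre-trained GAN/VAE decoder (latent -> data sample).
decoder = nn.Linear(latent_dim, 32)
for p in decoder.parameters():
    p.requires_grad_(False)

# A real plugin would stack several couplings with permutations and train them
# on (possibly partially) labeled data; one untrained layer suffices here.
flow = AffineCoupling(latent_dim)

# Conditional generation with a requested attribute combination: fix the
# attribute coordinates, sample the remaining "style" coordinates, invert the
# flow back to the original latent space, and decode with the frozen model.
labels = torch.tensor([[1.0, 0.0, 1.0]])      # hypothetical binary attribute values
style = torch.randn(1, latent_dim - n_attrs)  # unlabeled "style" factors
w = torch.cat([labels, style], dim=1)
z = flow.inverse(w)
sample = decoder(z)
print(sample.shape)                           # torch.Size([1, 32])
```

In a trained PluGeN-style plugin, the attribute coordinates would be fitted to the available labels while the style coordinates follow a standard prior; the sketch only shows how conditional sampling would use the inverse map of the flow.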
Pages: 6185-6198
Page count: 14
相关论文
共 50 条
  • [1] A Survey on Time-Series Pre-Trained Models
    Ma, Qianli
    Liu, Zhen
    Zheng, Zhenjing
    Huang, Ziyang
    Zhu, Siying
    Yu, Zhongzhong
    Kwok, James T.
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2024, 36 (12) : 7536 - 7555
  • [2] Pre-Trained Models for Non-Intrusive Appliance Load Monitoring
    Wang, Lingxiao
    Mao, Shiwen
    Wilamowski, Bogdan M.
    Nelms, Robert M.
    IEEE TRANSACTIONS ON GREEN COMMUNICATIONS AND NETWORKING, 2022, 6 (01): : 56 - 68
  • [3] Representation Transfer Learning via Multiple Pre-Trained Models for Linear Regression
    Singh, Navjot
    Diggavi, Suhas
    IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 2025, 19 (01) : 208 - 220
  • [4] MtArtGPT: A Multi-Task Art Generation System With Pre-Trained Transformer
    Jin, Cong
    Zhu, Ruolin
    Zhu, Zixing
    Yang, Lu
    Yang, Min
    Luo, Jiebo
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2024, 34 (08) : 6901 - 6912
  • [5] Efficient Key-Based Adversarial Defense for ImageNet by Using Pre-Trained Models
    Maungmaung, Aprilpyone
    Echizen, Isao
    Kiya, Hitoshi
    IEEE OPEN JOURNAL OF SIGNAL PROCESSING, 2024, 5 : 902 - 913
  • [6] Simple and Effective Multimodal Learning Based on Pre-Trained Transformer Models
    Miyazawa, Kazuki
    Kyuragi, Yuta
    Nagai, Takayuki
    IEEE ACCESS, 2022, 10 : 29821 - 29833
  • [7] Pre-Trained Language Models and Their Applications
    Wang, Haifeng
    Li, Jiwei
    Wu, Hua
    Hovy, Eduard
    Sun, Yu
    ENGINEERING, 2023, 25 : 51 - 65
  • [8] Multi-Label Clinical Time-Series Generation via Conditional GAN
    Lu, Chang
    Reddy, Chandan K.
    Wang, Ping
    Nie, Dong
    Ning, Yue
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2024, 36 (04) : 1728 - 1740
  • [9] Lottery Jackpots Exist in Pre-Trained Models
    Zhang, Yuxin
    Lin, Mingbao
    Zhong, Yunshan
    Chao, Fei
    Ji, Rongrong
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (12) : 14990 - 15004
  • [10] Pre-trained models: Past, present and future
    Han, Xu
    Zhang, Zhengyan
    Ding, Ning
    Gu, Yuxian
    Liu, Xiao
    Huo, Yuqi
    Qiu, Jiezhong
    Yao, Yuan
    Zhang, Ao
    Zhang, Liang
    Han, Wentao
    Huang, Minlie
    Jin, Qin
    Lan, Yanyan
    Liu, Yang
    Liu, Zhiyuan
    Lu, Zhiwu
    Qiu, Xipeng
    Song, Ruihua
    Tang, Jie
    Wen, Ji-Rong
    Yuan, Jinhui
    Zhao, Wayne Xin
    Zhu, Jun
    AI OPEN, 2021, 2 : 225 - 250