GAN-GLS: Generative Lyric Steganography Based on Generative Adversarial Networks

Cited by: 13
Authors
Wang, Cuilin [1]
Liu, Yuling [1]
Tong, Yongju [1]
Wang, Jingwen [2]
Affiliations
[1] Hunan Univ, Coll Comp Sci & Elect Engn, Changsha 410082, Hunan, Peoples R China
[2] Elizabethtown Coll, Dept Comp Sci, Elizabethtown, PA 17022 USA
Source
CMC-COMPUTERS MATERIALS & CONTINUA | 2021, Vol. 69, No. 1
Funding
National Natural Science Foundation of China
Keywords
Text steganography; generative adversarial networks; text generation; generated lyric; TEXT STEGANOGRAPHY; IMAGE STEGANOGRAPHY;
DOI
10.32604/cmc.2021.017950
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Steganography based on generative adversarial networks (GANs) has become a hot topic among researchers. Because GANs do not handle the discrete nature of text well, most existing GAN-based steganography methods rely little on text. In this paper, we propose a new generative lyric steganography method based on GANs, called GAN-GLS. The proposed method uses a GAN model and a large-scale lyrics corpus to construct and train a lyrics generator. In this method, the GAN takes a previously generated line of a lyric as the input sentence in order to generate the next line. Using a penalty-based strategy during training, the GAN model generates non-repetitive and diverse lyrics. The secret information is then processed according to the data characteristics of the generated lyrics so that it can be hidden in them. Unlike other generation-based linguistic steganography methods, our method changes how multiple generated candidate items are selected into candidate groups used to encode the conditional probability distribution. The experimental results demonstrate that our method can generate high-quality lyrics as stego-texts. Moreover, compared with similar methods, the proposed method achieves good performance in terms of imperceptibility, embedding rate, effectiveness, extraction success rate and security.
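The candidate-group encoding idea described in the abstract can be illustrated with a short sketch: at each generation step, the secret bits select one item from the top-ranked candidates of a conditional generator, and the receiver recovers the bits by re-ranking the same candidates and reading off each choice's index. This is a minimal illustration of generation-based candidate encoding under stated assumptions, not the paper's exact GAN-GLS procedure; the generator interface (next_token_probs) and the bits-per-step parameter are hypothetical names introduced here for clarity.

```python
# Minimal sketch of candidate-group bit encoding for generation-based text
# steganography. Assumes a conditional generator exposed as `next_token_probs`
# (hypothetical interface), mapping a token prefix to {candidate: probability}.
from typing import Callable, Dict, List

def embed_bits(
    next_token_probs: Callable[[List[str]], Dict[str, float]],
    secret_bits: str,
    bits_per_step: int = 2,
    max_len: int = 50,
) -> List[str]:
    """Generate a token sequence whose candidate choices encode `secret_bits`."""
    tokens: List[str] = []
    pos = 0
    while pos < len(secret_bits) and len(tokens) < max_len:
        probs = next_token_probs(tokens)
        # Rank candidates by conditional probability and keep the top 2^b
        # (assumes the candidate pool has at least 2^b items).
        ranked = sorted(probs, key=probs.get, reverse=True)[: 2 ** bits_per_step]
        # The next b secret bits index into the candidate group.
        chunk = secret_bits[pos : pos + bits_per_step].ljust(bits_per_step, "0")
        tokens.append(ranked[int(chunk, 2)])
        pos += bits_per_step
    return tokens

def extract_bits(
    next_token_probs: Callable[[List[str]], Dict[str, float]],
    stego_tokens: List[str],
    bits_per_step: int = 2,
) -> str:
    """Recover the bits by re-ranking candidates and reading each token's index."""
    bits = ""
    for i, tok in enumerate(stego_tokens):
        probs = next_token_probs(stego_tokens[:i])
        ranked = sorted(probs, key=probs.get, reverse=True)[: 2 ** bits_per_step]
        bits += format(ranked.index(tok), f"0{bits_per_step}b")
    return bits
```

In this sketch the embedding rate is fixed at `bits_per_step` bits per generated token; GAN-GLS instead works at the level of generated lyric candidates and selects candidate groups adaptively, so the code above should be read only as a schematic of the shared encode/decode principle.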
Pages: 1375-1390
Page count: 16