A GAN Model With Self-attention Mechanism To Generate Multi-instruments Symbolic Music

Cited by: 0
Authors
Guan, Faqian [1 ]
Yu, Chunyan [1 ]
Yang, Suqiong [1 ]
Affiliations
[1] Fuzhou Univ, Coll Math & Comp Sci, Fuzhou, Peoples R China
Source
2019 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN) | 2019
Keywords
symbolic music generation; Generative Adversarial Networks; multi-instruments; switchable normalization; self-attention mechanism;
DOI
10.1109/ijcnn.2019.8852291
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
GANs have recently been shown capable of generating symbolic music in piano-roll form. However, existing GAN-based multi-track music generation methods are often unstable during training. Moreover, because they extract temporal features poorly, the generated multi-track music does not sound natural. We therefore propose DMB-GAN, a new GAN model with a self-attention mechanism that extracts richer temporal features and generates multi-instrument music stably. First, to generate more coherent and natural single-track music, we introduce a self-attention mechanism so that the GAN-based generation model extracts temporal as well as spatial features. Second, to generate multi-instrument music with a harmonic structure across all tracks, we construct a dual generative adversarial architecture with multiple branches, one branch per track. Finally, to improve the quality of the generated multi-instrument symbolic music, we introduce switchable normalization to stabilize network training. Experimental results show that DMB-GAN stably generates coherent, natural multi-instrument music of good quality.
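The abstract's core building block, scaled dot-product self-attention over the time axis of a piano-roll embedding, can be sketched as follows. This is not the paper's DMB-GAN implementation; the function name, shapes, and random projections below are illustrative assumptions, showing only how each time step attends to every other time step to capture long-range temporal structure.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention over a sequence of feature vectors.

    x: (T, d) sequence, e.g. T time steps of a piano-roll embedding.
    w_q, w_k, w_v: (d, d_k) projection matrices (learned in a real model).
    Returns: (T, d_k) features, each a weighted mix over all time steps.
    """
    q = x @ w_q                                  # queries
    k = x @ w_k                                  # keys
    v = x @ w_v                                  # values
    scores = q @ k.T / np.sqrt(k.shape[-1])      # pairwise time-step affinities
    # Softmax over the key axis: each time step attends to all others.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

# Toy example: 4 time steps with 8-dimensional features.
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
w_q, w_k, w_v = (rng.standard_normal((8, 8)) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (4, 8)
```

Because every output row is a convex combination of value vectors from all T positions, this layer lets a GAN generator relate distant bars of music directly, rather than only through stacked local convolutions.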
Pages: 6