Human-computer Interaction Based Music Emotion Visualization System and User Experience Assessment

Cited by: 0
Authors
He R. [1 ]
Geng M. [1 ]
Guo J. [1 ]
Affiliations
[1] The Conservatory of Music, Hebei Institute of Communications, Shijiazhuang, Hebei
Source
Computer-Aided Design and Applications | 2024, Vol. 21, Issue S7
Keywords
CAD; Emotion Recognition; Human-Computer Interaction; Music Visualization;
DOI
10.14733/cadaps.2024.S7.133-147
Abstract
Music visualization arises from the deepening integration of the perceptual and auditory dimensions of music with emerging multimedia; it is a process-oriented presentation method that offers a new way of interpreting and performing music for appreciation. This article studies the application of computer-aided design (CAD) in a music emotion visualization system and, in combination with a convolutional neural network (CNN), constructs a mapping model between musical features and emotion for digital music emotion recognition. Using CAD technology, structural music features are extracted and computed to separate the main melody from the auxiliary melody. Based on the separated melodies, a comprehensive visualization design is then carried out that highlights the main melody. In the experimental part, the performance of the music emotion recognition algorithm is tested and the user experience is assessed. The results show that the system achieves good simulation accuracy and user interaction experience, improving the interactivity of CAD-based design and viewing of music emotion visualizations. Compared with recurrent neural network (RNN), support vector machine (SVM), and other emotion recognition models, the proposed model attains a higher music emotion recognition rate, which is of significance for research on music emotion visualization systems. © 2024 U-turn Press LLC.
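The abstract describes a CNN-based mapping from musical features to emotion classes. As a minimal, purely illustrative sketch (the paper's actual architecture, feature set, and emotion taxonomy are not given in this record), the forward pass of such a mapping — convolution over a spectrogram-like feature matrix, ReLU, global pooling, then a softmax over assumed emotion labels — can be written in plain NumPy:

```python
import numpy as np

# Illustrative assumptions: a four-class emotion set and a single conv layer.
EMOTIONS = ["happy", "sad", "tense", "calm"]

def conv2d_valid(x, kernel):
    """Naive 2-D 'valid' cross-correlation of feature matrix x with kernel."""
    kh, kw = kernel.shape
    out = np.zeros((x.shape[0] - kh + 1, x.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * kernel)
    return out

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def emotion_scores(spectrogram, kernels, w, b):
    """Forward pass: conv -> ReLU -> global average pool -> linear -> softmax."""
    pooled = np.array([np.maximum(conv2d_valid(spectrogram, k), 0).mean()
                       for k in kernels])
    return softmax(pooled @ w + b)

rng = np.random.default_rng(0)
spec = rng.random((64, 128))              # stand-in 64-band x 128-frame spectrogram
kernels = rng.standard_normal((8, 3, 3))  # 8 random 3x3 filters
w = rng.standard_normal((8, len(EMOTIONS)))
b = np.zeros(len(EMOTIONS))

probs = emotion_scores(spec, kernels, w, b)
print(EMOTIONS[int(np.argmax(probs))], probs.round(3))
```

With random weights the output is of course meaningless; in practice the filters and the linear layer would be trained on labeled music clips, and the spectrogram would come from real audio feature extraction.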
Pages: 133–147
Page count: 14
Related Papers
16 items in total
  • [1] Bao G., Yang K., Tong L., Shu J., Zhang R., Wang L., Yan B., Zeng Y., Linking multi-layer dynamical GCN with style-based recalibration CNN for EEG-based emotion recognition, Front Neurorobot, 2, 24, (2022)
  • [2] Cai L., Hu Y., Dong J., Audio-textual emotion recognition based on improved neural networks, Mathematical Problems in Engineering, 2019, 6, pp. 1-9, (2019)
  • [3] Correia N., Tanaka A., From GUI to AVUI: situating audiovisual user interfaces within human-computer interaction and related fields, EAI Endorsed Transactions on Creative Technologies, 8, 27, pp. 1-9, (2021)
  • [4] Dondi P., Porta M., Gaze-based human–computer interaction for museums and exhibitions: technologies, applications and future perspectives, Electronics, 12, 14, (2023)
  • [5] Dong Y., Yang X., Zhao X., Bidirectional convolutional recurrent sparse network (bcrsn): an efficient model for music emotion recognition, IEEE Transactions on Multimedia, 21, 12, pp. 3150-3163, (2019)
  • [6] Han J., Research on layout optimization of human-computer interaction interface of electronic music products based on ERP technology, International Journal of Product Development, 27, 1-2, pp. 126-139, (2023)
  • [7] Liang Y., Willemsen M.-C., Promoting music exploration through personalized nudging in a genre exploration recommender, International Journal of Human–Computer Interaction, 39, 7, pp. 1495-1518, (2023)
  • [8] Liao N.-J., Research on intelligent interactive music information based on visualization technology, Journal of Intelligent Systems, 31, 1, pp. 289-297, (2022)
  • [9] Lin W.-Q., Chao L., Zhang Y.-J., Emotion visualization system based on physiological signals combined with the picture and scene, Information Visualization, 21, 4, pp. 393-404, (2022)
  • [10] Lv Z., Poiesi F., Dong Q., Lloret J., Song H., Deep learning for intelligent human–computer interaction, Applied Sciences, 12, 22, (2022)