Evaluation of Data Inconsistency for Multi-modal Sentiment Analysis

Cited by: 0
Authors
Wang, Yufei [1 ]
Wu, Mengyue [1 ]
Affiliations
[1] Shanghai Jiao Tong Univ, Shanghai 200000, Peoples R China
Source
MAN-MACHINE SPEECH COMMUNICATION, NCMMSC 2024 | 2025, Vol. 2312
Keywords
Multi-modal Sentiment Analysis; Multi-modal Large Language Model; Data Inconsistency;
DOI
10.1007/978-981-96-1045-7_25
Chinese Library Classification (CLC)
O42 [Acoustics]
Discipline Classification Codes
070206; 082403
Abstract
Emotion semantic inconsistency is a ubiquitous challenge in multi-modal sentiment analysis (MSA), which analyzes sentiment expressed across modalities such as text, audio, and video. Because human emotional expression is subtle and nuanced, each modality may convey a distinct aspect of sentiment; the resulting inconsistency can hinder the predictions of artificial agents. In this work, we introduce a modality-conflicting test set and assess the performance of both traditional multi-modal sentiment analysis models and multi-modal large language models (MLLMs). Our findings reveal significant performance degradation in traditional models when confronted with semantically conflicting data and point out the drawbacks of MLLMs in handling multi-modal emotion analysis. Our research presents a new challenge and offers valuable insights for the future development of sentiment analysis systems.
Pages: 299-310
Number of pages: 12
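The record above does not include the paper's construction procedure for the modality-conflicting test set. As a rough illustration of the idea only, the sketch below partitions an MSA test set by whether per-modality sentiment labels disagree; the `Sample` fields, `is_conflicting`, and `split_by_consistency` are hypothetical names, not the authors' code, and assume per-modality labels are already available (e.g., from unimodal annotators or unimodal baseline models).

```python
# Hypothetical sketch (not the paper's pipeline): split an MSA test set into
# modality-consistent vs. modality-conflicting subsets, given per-modality labels.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Sample:
    text_label: int    # -1 = negative, 0 = neutral, 1 = positive
    audio_label: int
    video_label: int
    fused_label: int   # gold multi-modal sentiment label


def is_conflicting(s: Sample) -> bool:
    """A sample counts as modality-conflicting if its unimodal labels disagree."""
    return len({s.text_label, s.audio_label, s.video_label}) > 1


def split_by_consistency(samples: List[Sample]) -> Tuple[List[Sample], List[Sample]]:
    """Partition the test set so model accuracy can be compared on the two subsets."""
    consistent = [s for s in samples if not is_conflicting(s)]
    conflicting = [s for s in samples if is_conflicting(s)]
    return consistent, conflicting


if __name__ == "__main__":
    data = [
        Sample(text_label=1, audio_label=1, video_label=1, fused_label=1),    # consistent
        Sample(text_label=1, audio_label=-1, video_label=0, fused_label=-1),  # conflicting
    ]
    consistent, conflicting = split_by_consistency(data)
    print(len(consistent), "consistent /", len(conflicting), "conflicting")
```

Reporting a model's accuracy separately on the two subsets would then quantify the kind of degradation under semantic inconsistency that the abstract describes.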