Multimodal emotion recognition algorithm based on edge network emotion element compensation and data fusion

Cited by: 1
Authors
Yu Wang
Affiliations
[1] Henan University of Engineering,College of Computer Science
[2] State Key Laboratory of Mathematical Engineering and Advanced Computing
Source
Personal and Ubiquitous Computing | 2019 / Volume 23
Keywords
Emotion recognition; Edge network; Multimodal; Emotion compensation; Data fusion
DOI
Not available
Abstract
Emotion-recognition feature sets collected over complex networks suffer from redundant information, difficult recognition, and data loss, which strongly interferes with the emotional features extracted from speech or images. To address these problems, this paper studies a multimodal emotion recognition algorithm based on emotion element compensation in the context of streaming-media communication over an edge network. First, an edge streaming-media network is designed that shifts traditional server-centric transmission tasks to edge nodes; this architecture transforms complex-network problems into problems at the edge nodes and the user side. Second, multimodal parallel training is realized through a cooperative combination of weight equalization, and nonlinear-mapping inference is mapped onto an improved emotional data-fusion relationship. Then, considering the nonlinearity and uncertainty of the different types of emotional data samples in the training subsets, emotion-recognition data compensation evolves into emotion element compensation, which facilitates qualitative analysis and optimal decision-making. Finally, simulation results show that the proposed multimodal emotion recognition algorithm improves the recognition rate by 3.5%, reduces the average response time by 5.7%, and reduces the average number of iterations per unit time by a factor of 1.35.
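The abstract gives no equations, so the following is only an illustrative sketch of the weight-equalized, decision-level multimodal fusion it describes; the modality names, the validation accuracies used to derive the weights, and the softmax normalization are all assumptions, not the paper's actual method.

```python
import math

def equalized_weights(accuracies):
    """Softmax-normalize per-modality validation accuracies into fusion weights
    that sum to 1 (an assumed form of 'weight equalization')."""
    exps = [math.exp(a) for a in accuracies]
    total = sum(exps)
    return [e / total for e in exps]

def fuse_scores(modality_scores, weights):
    """Fuse per-class score vectors from each modality by weighted sum."""
    n_classes = len(modality_scores[0])
    fused = [0.0] * n_classes
    for scores, w in zip(modality_scores, weights):
        for i, s in enumerate(scores):
            fused[i] += w * s
    return fused

# Hypothetical example: speech and image modalities scoring three emotion classes.
weights = equalized_weights([0.72, 0.68])   # assumed per-modality accuracies
speech_scores = [0.6, 0.3, 0.1]
image_scores = [0.2, 0.5, 0.3]
fused = fuse_scores([speech_scores, image_scores], weights)
prediction = max(range(len(fused)), key=fused.__getitem__)
```

In this sketch the modality with the higher assumed accuracy (speech) receives a slightly larger weight, so its top class dominates the fused decision; the paper's actual compensation step operates on emotion elements rather than raw scores.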
Pages: 383–392
Number of pages: 9