Riding feeling recognition based on multi-head self-attention LSTM for driverless automobile

Citations: 1
Authors:
Tang, Xianzhi [1 ]
Xie, Yongjia [1 ]
Li, Xinlong [1 ]
Wang, Bo [1 ]
Affiliations:
[1] Yanshan Univ, Sch Vehicles & Energy, Hebei Key Lab Special Carrier Equipment, Hebei St, Qinhuangdao 066004, Hebei, Peoples R China
Funding:
National Natural Science Foundation of China;
Keywords:
Electroencephalography (EEG); Attention; Feature extraction; Driving experience;
DOI:
10.1016/j.patcog.2024.111135
CLC Number:
TP18 [Theory of Artificial Intelligence];
Discipline Codes:
081104; 0812; 0835; 1405;
Abstract:
With the emergence of driverless technology, passenger ride comfort has become an issue of growing concern. In recent years, EEG-based driving fatigue detection and braking sensation evaluation have received increasing attention, and analyzing ride comfort from EEG signals offers a similarly direct approach. However, finding an effective method or model to evaluate passenger comfort remains a challenge. In this paper, we propose a long short-term memory (LSTM) network model based on a multi-head self-attention mechanism for passenger comfort detection. Applying multi-head self-attention during feature extraction yields more discriminative features and, in turn, more accurate classification. The results show that the LSTM network with multi-head self-attention is efficient in decision making and achieves higher classification accuracy. In conclusion, the multi-head attention-based classifier proposed in this paper performs well in classifying EEG signals across different emotional states and holds broad promise for brain-computer interaction.
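The abstract describes multi-head self-attention applied during feature extraction, feeding an LSTM that classifies passenger comfort states from EEG. The following is a minimal PyTorch sketch of that general architecture, not the paper's reported model: the input shape (32 EEG channels, 200 time steps), layer sizes, and three-class output are illustrative assumptions.

    # Minimal sketch of a multi-head self-attention + LSTM EEG classifier.
    # All hyperparameters below are illustrative assumptions, not the
    # configuration reported in the paper.
    import torch
    import torch.nn as nn

    class AttentionLSTMClassifier(nn.Module):
        """Multi-head self-attention over EEG time steps, then an LSTM."""

        def __init__(self, n_channels=32, d_model=64, n_heads=4,
                     hidden_size=128, n_classes=3):
            super().__init__()
            # Project raw EEG channels into the attention model dimension.
            self.input_proj = nn.Linear(n_channels, d_model)
            # Multi-head self-attention re-weights time steps before the LSTM.
            self.attn = nn.MultiheadAttention(d_model, n_heads,
                                              batch_first=True)
            self.norm = nn.LayerNorm(d_model)
            self.lstm = nn.LSTM(d_model, hidden_size, batch_first=True)
            self.head = nn.Linear(hidden_size, n_classes)

        def forward(self, x):  # x: (batch, time, channels)
            h = self.input_proj(x)
            # Self-attention: queries, keys, and values all come from h.
            a, _ = self.attn(h, h, h)
            h = self.norm(h + a)          # residual connection
            out, _ = self.lstm(h)
            return self.head(out[:, -1])  # classify from the last time step

    # Usage: a batch of 8 EEG windows, 200 time steps, 32 channels.
    model = AttentionLSTMClassifier()
    logits = model(torch.randn(8, 200, 32))
    print(logits.shape)  # torch.Size([8, 3])

The residual connection and layer normalization around the attention block follow common Transformer practice; the paper may order or combine these operations differently.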
Pages: 12
Related Papers (50 total; first 10 shown)
  • [1] Arrhythmia classification algorithm based on multi-head self-attention mechanism
    Wang, Yue
    Yang, Guanci
    Li, Shaobo
    Li, Yang
    He, Ling
    Liu, Dan
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2023, 79
  • [2] Multi-Head Self-Attention Transformation Networks for Aspect-Based Sentiment Analysis
    Lin, Yuming
    Wang, Chaoqiang
    Song, Hao
    Li, You
    IEEE ACCESS, 2021, 9 : 8762 - 8770
  • [3] Hunt for Unseen Intrusion: Multi-Head Self-Attention Neural Detector
    Seo, Seongyun
    Han, Sungmin
    Park, Janghyeon
    Shim, Shinwoo
    Ryu, Han-Eul
    Cho, Byoungmo
    Lee, Sangkyun
    IEEE ACCESS, 2021, 9 : 129635 - 129647
  • [4] A Speech Recognition Model Building Method Combined Dynamic Convolution and Multi-Head Self-Attention Mechanism
    Liu, Wei
    Sun, Jiaming
    Sun, Yiming
    Chen, Chunyi
    ELECTRONICS, 2022, 11 (10)
  • [5] AttenEpilepsy: A 2D convolutional network model based on multi-head self-attention
    Ma, Shuang
    Wang, Haifeng
    Yu, Zhihao
    Du, Luyao
    Zhang, Ming
    Fu, Qingxi
    ENGINEERING ANALYSIS WITH BOUNDARY ELEMENTS, 2024, 169
  • [6] Integration of Multi-Head Self-Attention and Convolution for Person Re-Identification
    Zhou, Yalei
    Liu, Peng
    Cui, Yue
    Liu, Chunguang
    Duan, Wenli
    SENSORS, 2022, 22 (16)
  • [7] Multi-Head Self-Attention for 3D Point Cloud Classification
    Gao, Xue-Yao
    Wang, Yan-Zhao
    Zhang, Chun-Xiang
    Lu, Jia-Qi
    IEEE ACCESS, 2021, 9 : 18137 - 18147
  • [8] Automatic segmentation of golden pomfret based on fusion of multi-head self-attention and channel-attention mechanism
    Yu, Guoyan
    Luo, Yingtong
    Deng, Ruoling
    COMPUTERS AND ELECTRONICS IN AGRICULTURE, 2022, 202
  • [9] Multi-head self-attention mechanism-based global feature learning model for ASD diagnosis
    Zhao, Feng
    Feng, Fan
    Ye, Shixin
    Mao, Yanyan
    Chen, Xiaobo
    Li, Yuan
    Ning, Mao
    Zhang, Mingli
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2024, 91
  • [10] A Multi-Head Self-Attention Transformer-Based Model for Traffic Situation Prediction in Terminal Areas
    Yu, Zhou
    Shi, Xingyu
    Zhang, Zhaoning
    IEEE ACCESS, 2023, 11 : 16156 - 16165