Riding feeling recognition based on multi-head self-attention LSTM for driverless automobile

Cited by: 1
Authors
Tang, Xianzhi [1 ]
Xie, Yongjia [1 ]
Li, Xinlong [1 ]
Wang, Bo [1 ]
Affiliations
[1] Yanshan Univ, Sch Vehicles & Energy, Hebei Key Lab Special Carrier Equipment, Hebei St, Qinhuangdao 066004, Hebei, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Electroencephalography (EEG); Attention; Feature extraction; Driving experience;
DOI
10.1016/j.patcog.2024.111135
Chinese Library Classification
TP18 [Artificial intelligence theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
With the emergence of driverless technology, passenger ride comfort has become an issue of concern. In recent years, driving-fatigue detection and braking-sensation evaluation based on EEG signals have received increasing attention, and analyzing ride comfort using EEG signals is likewise a direct and intuitive approach. However, finding an effective method or model to evaluate passenger comfort remains a challenge. In this paper, we propose a long short-term memory (LSTM) network model based on a multi-head self-attention mechanism for passenger comfort detection. By applying the multi-head attention mechanism to the feature-extraction process, more efficient classification results are obtained. The results show that the LSTM network using the multi-head self-attention mechanism is efficient in decision making and achieves higher classification accuracy. In conclusion, the classifier based on the multi-head attention mechanism proposed in this paper performs well in EEG classification of different emotional states and has broad prospects in brain-computer interaction.
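The abstract describes applying multi-head self-attention during feature extraction over EEG sequences. As an illustrative sketch only (not the authors' implementation), the core multi-head self-attention computation over a window of per-timestep EEG feature vectors can be written in NumPy, with randomly initialized projection matrices standing in for learned parameters:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(x, num_heads, rng):
    """x: (seq_len, d_model) window of EEG feature vectors.
    Returns the attended output and per-head attention weights."""
    seq_len, d_model = x.shape
    assert d_model % num_heads == 0
    d_head = d_model // num_heads
    # Random weights play the role of trained Q/K/V/output projections.
    Wq, Wk, Wv, Wo = (rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
                      for _ in range(4))

    def split_heads(m):  # (seq_len, d_model) -> (num_heads, seq_len, d_head)
        return m.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    q, k, v = (split_heads(x @ w) for w in (Wq, Wk, Wv))
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)  # (heads, seq, seq)
    attn = softmax(scores, axis=-1)                      # rows sum to 1
    ctx = attn @ v                                       # (heads, seq, d_head)
    ctx = ctx.transpose(1, 0, 2).reshape(seq_len, d_model)  # merge heads
    return ctx @ Wo, attn

# Example: 128 timesteps of 32-dimensional EEG features, 4 attention heads.
rng = np.random.default_rng(0)
eeg_window = rng.standard_normal((128, 32))
out, attn = multi_head_self_attention(eeg_window, num_heads=4, rng=rng)
```

In an architecture like the one described, the attended output would then be fed to (or combined with) LSTM hidden states before a final classification layer; the head count and feature dimension here are arbitrary choices for illustration.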
Pages: 12
Related Papers (50 total)
  • [21] Multimodal sentiment analysis based on multi-head attention mechanism
    Xi, Chen
    Lu, Guanming
    Yan, Jingjie
    ICMLSC 2020: PROCEEDINGS OF THE 4TH INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND SOFT COMPUTING, 2020, : 34 - 39
  • [22] A multi-head adjacent attention-based pyramid layered model for nested named entity recognition
    Cui, Shengmin
    Joe, Inwhee
    NEURAL COMPUTING & APPLICATIONS, 2023, 35 (03) : 2561 - 2574
  • [23] Hierarchical Multi-Task Learning Based on Interactive Multi-Head Attention Feature Fusion for Speech Depression Recognition
    Xing, Yujuan
    He, Ruifang
    Zhang, Chengwen
    Tan, Ping
    IEEE ACCESS, 2025, 13 : 51208 - 51219
  • [25] IS CROSS-ATTENTION PREFERABLE TO SELF-ATTENTION FOR MULTI-MODAL EMOTION RECOGNITION?
    Rajan, Vandana
    Brutti, Alessio
    Cavallaro, Andrea
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 4693 - 4697
  • [26] Pedestrian Attribute Recognition Based on Dual Self-attention Mechanism
    Fan, Zhongkui
    Guan, Ye-peng
    COMPUTER SCIENCE AND INFORMATION SYSTEMS, 2023, 20 (02) : 793 - 812
  • [27] The Multimodal Scene Recognition Method Based on Self-Attention and Distillation
    Sun, Ning
    Xu, Wei
    Liu, Jixin
    Chai, Lei
    Sun, Haian
    IEEE MULTIMEDIA, 2024, 31 (04) : 25 - 36
  • [28] GFA-SMT: Geometric Feature Aggregation and Self-Attention in a Multi-Head Transformer for 3D Object Detection in Autonomous Vehicles
    Mushtaq, Husnain
    Deng, Xiaoheng
    Jiang, Ping
    Wan, Shaohua
    Ali, Mubashir
    Ullah, Irshad
    IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2025, 26 (03) : 3557 - 3573
  • [29] LSTM-based multi-layer self-attention method for remaining useful life estimation of mechanical systems
    Xia, Jun
    Feng, Yunwen
    Lu, Cheng
    Fei, Chengwei
    Xue, Xiaofeng
    ENGINEERING FAILURE ANALYSIS, 2021, 125
  • [30] Human Activity Recognition Based on Self-Attention Mechanism in WiFi Environment
    Ge, Fei
    Yang, Zhimin
    Dai, Zhenyang
    Tan, Liansheng
    Hu, Jianyuan
    Li, Jiayuan
    Qiu, Han
    IEEE ACCESS, 2024, 12 : 85231 - 85243