Exploiting the Self-Attention Mechanism in Gas Sensor Array (GSA) Data With Neural Networks

Cited by: 2
Authors
Wang, Ningning [1 ]
Li, Silong [1 ]
Ye, Terry Tao [1 ]
Affiliations
[1] Southern Univ Sci & Technol, Dept Elect & Elect Engn, Shenzhen 518055, Peoples R China
Keywords
Sensors; Sensor arrays; Gases; Gas detectors; Quantization (signal); Feature extraction; Sensor phenomena and characterization; Gas classification; gas sensor array (GSA); long short-term memory (LSTM); self-attention mechanism; MACHINE OLFACTION; DISCRIMINATION;
DOI
10.1109/JSEN.2023.3240470
CLC classification
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology];
Subject classification codes
0808 ; 0809 ;
Abstract
Gas sensor array (GSA) data is a sequential series of values that represents the temporal presence, absence, or mixture of gases, and it exhibits similarities to the textual streams of natural language that carry semantic information. We speculate, and subsequently demonstrate, that self-attention mechanisms also exist in GSA data and can be exploited for gas classification and recognition. We first convert GSA data into a 1-D token series (the tokens are called WORDs in this work) by sampling and quantizing the sensor values, and then use an enhanced long short-term memory (LSTM) network, called LSTM-attention, to extract the self-attention structure in the GSA data. We demonstrate that LSTM-attention achieves much better accuracy (99.6%) than CNN-based networks as well as other GSA data processing techniques on the UCI dynamic gases dataset. We also find that the self-attention mechanism varies with the sampling and quantization levels used during data acquisition.
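The abstract's tokenization step (sampling and quantizing continuous sensor values into a 1-D series of discrete WORD tokens) can be sketched as follows. This is an illustrative NumPy sketch, not the authors' code; the uniform min-max binning and the `n_levels` parameter are assumptions made for clarity.

```python
import numpy as np

def tokenize_gsa(readings, n_levels=16):
    """Quantize a 1-D array of sensor readings into integer tokens.

    Each reading is mapped to one of `n_levels` uniform bins spanning
    the observed min-max range, yielding a token series analogous to
    a word sequence in natural language.
    """
    readings = np.asarray(readings, dtype=float)
    lo, hi = readings.min(), readings.max()
    if hi == lo:                      # constant signal -> single token
        return np.zeros(len(readings), dtype=int)
    # n_levels - 1 interior bin edges give token values 0..n_levels-1
    edges = np.linspace(lo, hi, n_levels + 1)[1:-1]
    return np.digitize(readings, edges)

# Example: a rising sensor response becomes a non-decreasing token series.
tokens = tokenize_gsa([0.0, 0.2, 0.5, 0.9, 1.0], n_levels=4)
print(tokens.tolist())  # → [0, 0, 2, 3, 3]
```

Per the abstract, the choice of sampling rate and quantization level is not neutral: it changes which self-attention patterns emerge in the token series.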
Pages: 5988-5996
Page count: 9
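The other ingredient the abstract names is attention applied on top of LSTM outputs. A common form, sketched here in NumPy under the assumption (not stated in the record) that LSTM-attention pools per-step hidden states with a learned scoring vector, is: score each time step, softmax-normalize, and take the weighted sum as the sequence representation fed to the gas classifier.

```python
import numpy as np

def attention_pool(hidden, w):
    """Collapse hidden states of shape (T, d) into one context vector (d,).

    `w` (shape (d,)) plays the role of a learned scoring vector; here it
    is simply passed in for illustration.
    """
    scores = hidden @ w                            # (T,) unnormalized scores
    scores = scores - scores.max()                 # numerical stability
    alpha = np.exp(scores) / np.exp(scores).sum()  # softmax attention weights
    return alpha @ hidden                          # attention-weighted sum

# Example: 6 time steps of 4-dim "LSTM" states (random stand-ins).
rng = np.random.default_rng(0)
H = rng.normal(size=(6, 4))
w = rng.normal(size=4)
ctx = attention_pool(H, w)
print(ctx.shape)  # → (4,)
```

With equal scores the pooling degenerates to a plain average over time steps, which is the baseline that attention weighting improves upon.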