A novel self-attention model based on cosine self-similarity for cancer classification of protein mass spectrometry

Cited by: 4
Authors
Tang, Long [1 ]
Xu, Ping [1 ]
Xue, Lingyun [1 ]
Liu, Yian [1 ]
Yan, Ming [1 ]
Chen, Anqi [2 ]
Hu, Shundi [2 ]
Wen, Luhong [2 ,3 ]
Affiliations
[1] Hangzhou Dianzi Univ, Coll Automat, Hangzhou 310028, Peoples R China
[2] Ningbo Univ, Res Inst Adv Technol, Ningbo 315211, Peoples R China
[3] China Innovat Instrument Co Ltd, Ningbo 315000, Peoples R China
Keywords
Mass spectrometry; Cosine self-similarity; Cancer classification; Deep learning; PROSTATE-CANCER; PROTEOMICS
DOI
10.1016/j.ijms.2023.117131
Chinese Library Classification (CLC)
O64 [Physical Chemistry (Theoretical Chemistry), Chemical Physics]; O56 [Molecular Physics, Atomic Physics]
Discipline codes
070203; 070304; 081704; 1406
Abstract
Mass spectrometry has become a popular tool for cancer classification. This paper proposes a novel self-attention deep learning model based on cosine self-similarity for classifying cancer from protein mass spectra. First, the primary feature vector is dimensionally reduced by two fully connected layers. Second, the reduced feature vector is reshaped into a 2D feature matrix, from which the cosine self-similarity matrix of the self-attention module is computed. Next, three convolutional layers extract a refined feature matrix. Finally, the refined feature matrix is fed into a multi-layer fully connected network to classify the mass spectra. Experimental results on ovarian and prostate cancer datasets demonstrate that the proposed method outperforms competing approaches.
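To make the described pipeline concrete, a minimal PyTorch sketch follows. It is one plausible reading of the abstract, not the authors' published implementation: every size (input_dim=15000 m/z bins, a 64 x 64 feature matrix, channel counts, n_classes=2) is a hypothetical placeholder, and the softmax-normalized row-wise cosine self-similarity is an assumed form of the attention map.

import torch
import torch.nn as nn
import torch.nn.functional as F

class CosineSelfAttentionClassifier(nn.Module):
    def __init__(self, input_dim=15000, side=64, n_classes=2):  # sizes are illustrative, not the paper's
        super().__init__()
        # Step 1: two fully connected layers reduce the raw spectrum to a
        # vector that can be reshaped into a side x side feature matrix.
        self.reduce = nn.Sequential(
            nn.Linear(input_dim, 1024), nn.ReLU(),
            nn.Linear(1024, side * side), nn.ReLU(),
        )
        self.side = side
        # Step 3: three convolutional layers extract the refined feature matrix.
        self.convs = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
            nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
        )
        # Step 4: a multi-layer fully connected head classifies the spectrum.
        self.head = nn.Sequential(
            nn.Linear(16 * side * side, 256), nn.ReLU(),
            nn.Linear(256, n_classes),
        )

    def forward(self, x):                           # x: (batch, input_dim) raw spectra
        v = self.reduce(x)                          # (batch, side*side)
        m = v.view(-1, self.side, self.side)        # Step 2: 2D feature matrix
        # Cosine self-similarity between rows of the feature matrix,
        # softmax-normalized and used as the attention map (an assumption).
        mn = F.normalize(m, dim=-1)                 # unit-norm rows
        attn = torch.softmax(mn @ mn.transpose(1, 2), dim=-1)  # (batch, side, side)
        refined = attn @ m                          # attention-weighted features
        out = self.convs(refined.unsqueeze(1))      # add channel dim -> (batch, 16, side, side)
        return self.head(out.flatten(1))            # class logits

# Usage on synthetic data: four spectra with 15000 m/z bins each.
model = CosineSelfAttentionClassifier()
logits = model(torch.randn(4, 15000))               # shape (4, 2)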
Pages: 8
Related papers
50 records in total
  • [1] Web service classification based on self-attention mechanism
    Jia, Zhichun
    Zhang, Zhiying
    Dong, Rui
    Yang, Zhongxuan
    Xing, Xing
2023 35TH CHINESE CONTROL AND DECISION CONFERENCE, CCDC, 2023: 2164 - 2169
  • [2] Image classification model based on large kernel attention mechanism and relative position self-attention mechanism
    Liu, Siqi
    Wei, Jiangshu
    Liu, Gang
    Zhou, Bei
    PEERJ COMPUTER SCIENCE, 2023, 9
  • [3] HiFun: homology independent protein function prediction by a novel protein-language self-attention model
    Wu, Jun
    Qing, Haipeng
    Ouyang, Jian
    Zhou, Jiajia
    Gao, Zihao
    Mason, Christopher E.
    Liu, Zhichao
    Shi, Tieliu
    BRIEFINGS IN BIOINFORMATICS, 2023, 24 (05)
  • [4] Self-Attention Mechanisms-Based Laryngoscopy Image Classification Technique for Laryngeal Cancer Detection
    Kang, Yi-Fan
    Yang, Lie
    Hu, Yi-Fan
    Xu, Kai
    Cai, Lan-Jun
    Hu, Bin-Bin
    Lu, Xiang
HEAD AND NECK-JOURNAL FOR THE SCIENCES AND SPECIALTIES OF THE HEAD AND NECK, 2025, 47 (03): 944 - 955
  • [5] In-depth Recommendation Model Based on Self-Attention Factorization
    Ma, Hongshuang
    Liu, Qicheng
KSII TRANSACTIONS ON INTERNET AND INFORMATION SYSTEMS, 2023, 17 (03): 721 - 739
  • [6] Deep Alternate Kernel Fused Self-Attention Model-Based Lung Nodule Classification
    Saritha, R. Rani
    Sangeetha, V.
JOURNAL OF ADVANCES IN INFORMATION TECHNOLOGY, 2024, 15 (11): 1242 - 1251
  • [7] Fake news detection and classification using hybrid BiLSTM and self-attention model
    Mohapatra, Asutosh
    Thota, Nithin
    Prakasam, P.
MULTIMEDIA TOOLS AND APPLICATIONS, 2022, 81 (13): 18503 - 18519
  • [8] Binary Function Similarity Detection Based on Graph Neural Network with Self-Attention Mechanism
    Wu, Dingjie
    He, Xuanzhang
    Zhang, Yao
    Zhu, Junjie
    Zhang, Xinyuan
    Ye, Minchao
    Gao, Zhigang
2022 IEEE INTL CONF ON DEPENDABLE, AUTONOMIC AND SECURE COMPUTING, INTL CONF ON PERVASIVE INTELLIGENCE AND COMPUTING, INTL CONF ON CLOUD AND BIG DATA COMPUTING, INTL CONF ON CYBER SCIENCE AND TECHNOLOGY CONGRESS (DASC/PICOM/CBDCOM/CYBERSCITECH), 2022: 971 - 975