BERT Probe: A python package for probing attention based robustness evaluation of BERT models

Cited by: 2
Authors
Khan, Shahrukh [1 ]
Shahid, Mahnoor [1 ]
Singh, Navdeeppal [1 ]
Affiliations
[1] Saarland Univ, Saarbrucken, Germany
Keywords
Deep learning; BERT; Transformers; Adversarial machine learning
DOI
10.1016/j.simpa.2022.100310
CLC number
TP31 [Computer software]
Subject classification codes
081202; 0835
Abstract
Attention-based Transformer models have been remarkably successful at establishing state-of-the-art results in natural language processing (NLP). However, recent work on the adversarial robustness of attention-based models shows that they are susceptible to adversarial inputs that cause spurious outputs, raising questions about the trustworthiness of such models. In this paper, we present BERT Probe, a Python-based package for evaluating attention-attribution-based robustness under character-level and word-level evasion attacks and for empirically quantifying potential vulnerabilities in sequence classification tasks. Additionally, BERT Probe provides two out-of-the-box defenses against character-level, attention attribution-based evasion attacks.
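The character-level evasion attacks and defenses mentioned in the abstract can be illustrated with a minimal sketch. This is not BERT Probe's actual API: `perturb_once`, `evasion_attack`, `normalize`, and `toy_classifier` are hypothetical stand-ins, and a real evaluation would query a fine-tuned BERT sequence classifier rather than the keyword-based toy model used here to keep the example self-contained.

```python
import random

# Visually similar substitutions often used in character-level attacks.
HOMOGLYPHS = {"a": "@", "o": "0", "l": "1", "e": "3"}

def perturb_once(text, rng):
    """Apply one random character-level edit (swap, delete, or
    homoglyph substitution) -- the kind of evasion studied here."""
    chars = list(text)
    op = rng.choice(["swap", "delete", "substitute"])
    if op == "swap" and len(chars) > 1:
        i = rng.randrange(len(chars) - 1)
        chars[i], chars[i + 1] = chars[i + 1], chars[i]
    elif op == "delete":
        del chars[rng.randrange(len(chars))]
    else:
        i = rng.randrange(len(chars))
        chars[i] = HOMOGLYPHS.get(chars[i], chars[i])
    return "".join(chars)

def evasion_attack(text, classify, max_queries=500, seed=0):
    """Black-box evasion: perturb until the predicted label flips."""
    rng = random.Random(seed)
    original = classify(text)
    for _ in range(max_queries):
        candidate = perturb_once(text, rng)
        if classify(candidate) != original:
            return candidate  # adversarial example found
    return None  # attack failed within the query budget

def normalize(text):
    """A simple defense: undo homoglyph substitutions before classifying."""
    reverse = {v: k for k, v in HOMOGLYPHS.items()}
    return "".join(reverse.get(c, c) for c in text)

# Stand-in classifier; a real evaluation would wrap a fine-tuned BERT
# sequence classifier instead (omitted here to keep the sketch runnable).
def toy_classifier(text):
    return "positive" if "great" in text else "negative"

adv = evasion_attack("This movie is great", toy_classifier)
```

A successful attack returns a minimally edited string that the classifier labels differently, while the normalization defense restores homoglyph-attacked inputs (e.g. `"gr3@t"` back to `"great"`) before classification.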
Pages: 3