Debunking Fake News by Leveraging Speaker Credibility and BERT Based Model

Cited: 0
Authors
Singh, Thoudam Doren [1 ]
Divyansha [1 ]
Singh, Apoorva Vikram [2 ]
Sachan, Anubhav [3 ]
Khilji, Abdullah Faiz Ur Rahman [1 ]
Affiliations
[1] Natl Inst Technol, Dept Comp Sci & Engn, Silchar, India
[2] Natl Inst Technol, Dept Elect Engn, Silchar, Silchar, India
[3] Natl Inst Technol, Dept Elect & Commun Engn, Silchar, Silchar, India
Source
2020 IEEE/WIC/ACM INTERNATIONAL JOINT CONFERENCE ON WEB INTELLIGENCE AND INTELLIGENT AGENT TECHNOLOGY (WI-IAT 2020) | 2020
Keywords
Fake News; Text Classification; BERT; LIAR;
DOI
10.1109/WIIAT50758.2020.00147
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The exponential growth of fake news, and its role in eroding public trust and democratic standards, calls for counter-combat approaches. Predicting the chances of news being fake is a hard task, since most deceptive news has its roots in true news: with a minor fabrication of legitimate news, influential fake news can be created and used for political, entertainment, or business-related gains. This work provides a novel, intuitive approach that exploits data from multiple sources to segregate news into real and fake. To efficiently capture the contextual information present in the data, Bidirectional Encoder Representations from Transformers (BERT) has been deployed. The performance of the deceptive news detection model is further enhanced by incorporating information about the speaker's profile and the credibility associated with them. A hybrid sequence encoding model has been proposed to harvest the speaker profile and speaker credibility data, making them useful for prediction. On evaluation over the benchmark fake news dataset LIAR, our model outperformed previous state-of-the-art works. This attests that the speaker's profile and credibility play a crucial role in predicting the validity of news.
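The abstract's idea of pairing a text representation with a speaker-credibility signal can be sketched minimally as follows. This is an illustrative sketch, not the authors' code: the feature names follow the LIAR dataset's per-speaker truthfulness-history counts, the 3-dimensional "text embedding" stands in for a real BERT sentence vector, and the function names are assumptions for illustration.

```python
# Illustrative sketch (not the paper's implementation): derive a
# speaker-credibility feature vector from LIAR-style history counts
# and fuse it with a (stubbed) BERT text embedding.

def credibility_features(history):
    """Normalize a speaker's truthfulness-history counts into a
    fixed-order feature vector (labels sorted alphabetically)."""
    total = sum(history.values()) or 1          # avoid division by zero
    labels = sorted(history)                    # fixed feature order
    return [history[label] / total for label in labels]

def fuse(text_embedding, cred_vector):
    """Concatenate both views, as a hybrid encoder would do before
    passing the joint representation to a classification head."""
    return list(text_embedding) + list(cred_vector)

# A speaker's LIAR-style credit history (counts are hypothetical).
history = {"pants-fire": 2, "false": 3, "barely-true": 1,
           "half-true": 3, "mostly-true": 1}
cred = credibility_features(history)
features = fuse([0.1, -0.4, 0.7], cred)   # stub 3-dim "BERT" embedding
```

In the paper the text side would come from BERT and the fused vector would feed a trained classifier; the sketch only shows the feature construction and fusion step.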
Pages: 960 - 968 (9 pages)
Related Papers
50 records
  • [1] Fake News Detection Using BERT Model with Joint Learning
    Shishah, Wesam
    ARABIAN JOURNAL FOR SCIENCE AND ENGINEERING, 2021, 46 (09) : 9115 - 9127
  • [2] Fake news detection based on a hybrid BERT and LightGBM models
    Essa, Ehab
    Omar, Karima
    Alqahtani, Ali
    COMPLEX & INTELLIGENT SYSTEMS, 2023, 9 (06) : 6581 - 6592
  • [3] Leveraging Socio-contextual Information in BERT for Fake Health News Detection in Social Media
    Upadhyay, Rishabh
    Pasi, Gabriella
    Viviani, Marco
    PROCEEDINGS OF THE 2023 WORKSHOP ON OPEN CHALLENGES IN ONLINE SOCIAL NETWORKS, OASIS 2023 / 34TH ACM CONFERENCE ON HYPERTEXT AND SOCIAL MEDIA, HT 2023, 2023, : 38 - 46
  • [4] Fake News Detection System, based on CBOW and BERT
    Vo, Trung Hung
    Felde, Imre
    Ninh, Khanh Chi
    ACTA POLYTECHNICA HUNGARICA, 2025, 22 (01) : 27 - 41
  • [5] Smart Edge-based Fake News Detection using Pre-trained BERT Model
    Guo, Yuhang
    Lamaazi, Hanane
    Mizouni, Rabeb
    2022 18TH INTERNATIONAL CONFERENCE ON WIRELESS AND MOBILE COMPUTING, NETWORKING AND COMMUNICATIONS (WIMOB), 2022
  • [6] Investigating the Difference of Fake News Source Credibility Recognition between ANN and BERT Algorithms in Artificial Intelligence
    Chiang, Tosti H. C.
    Liao, Chih-Shan
    Wang, Wei-Ching
    APPLIED SCIENCES-BASEL, 2022, 12 (15)
  • [7] A BERT-Based Semantic Enhanced Model for COVID-19 Fake News Detection
    Yin, Hui
    Liu, Xiao
    Wu, Yutao
    Aria, Hilya Mudrika
    Mohawesh, Rami
    WEB AND BIG DATA, PT I, APWEB-WAIM 2023, 2024, 14331 : 1 - 15
  • [8] Fake News Detection Using Enhanced BERT
    Aljawarneh, Shadi A.
    Swedat, Safa Ahmad
    IEEE TRANSACTIONS ON COMPUTATIONAL SOCIAL SYSTEMS, 2024, 11 (04) : 4843 - 4850