Fine-Tuning Pre-Trained Model for Consumer Fraud Detection from Consumer Reviews

Cited: 0
Authors
Tang, Xingli [1 ]
Li, Keqi [1 ]
Huang, Liting [1 ]
Zhou, Hui [1 ]
Ye, Chunyang [1 ]
Affiliations
[1] Hainan Univ, Haikou, Hainan, Peoples R China
Keywords
Consumer fraud detection; Consumer reviews; Regulation
DOI
10.1007/978-3-031-39821-6_38
CLC Number
TP31 [Computer Software]
Discipline Codes
081202; 0835
Abstract
Consumer fraud is a significant problem that requires accurate and prompt detection. However, existing approaches such as periodic government inspections and consumer reports are inefficient and cumbersome. This paper proposes a novel approach, CFD-BERT, to detect consumer fraud automatically based on the group intelligence embedded in consumer reviews. CFD-BERT exploits the correlation between consumer reviews and official regulations to accurately mine consumer fraud patterns, and fine-tunes the pretrained model BERT to better capture their semantics, enabling the detection of fraudulent behaviors. Experimental evaluations on real-world datasets confirm the effectiveness of CFD-BERT in fraud detection. To explore its potential application and usefulness in real-world scenarios, an empirical study was conducted with CFD-BERT on 143,587 reviews from the previous three months. The results confirm that CFD-BERT can serve as an auxiliary tool to provide early warnings to regulators and consumers.
Pages: 451-456
Page count: 6
Related Papers
50 records
  • [41] Virtual Data Augmentation: A Robust and General Framework for Fine-tuning Pre-trained Models
    Zhou, Kun
    Zhao, Wayne Xin
    Wang, Sirui
    Zhang, Fuzheng
    Wu, Wei
    Wen, Ji-Rong
    2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021, : 3875 - 3887
  • [42] Towards Efficient Fine-Tuning of Pre-trained Code Models: An Experimental Study and Beyond
    Shi, Ensheng
    Wang, Yanlin
    Zhang, Hongyu
    Du, Lun
    Han, Shi
    Zhang, Dongmei
    Sun, Hongbin
    PROCEEDINGS OF THE 32ND ACM SIGSOFT INTERNATIONAL SYMPOSIUM ON SOFTWARE TESTING AND ANALYSIS, ISSTA 2023, 2023, : 39 - 51
  • [43] Confounder balancing in adversarial domain adaptation for pre-trained large models fine-tuning
    Jiang, Shuoran
    Chen, Qingcai
    Xiang, Yang
    Pan, Youcheng
    Wu, Xiangping
    Lin, Yukang
    NEURAL NETWORKS, 2024, 173
  • [44] Improving Performance of Seismic Fault Detection by Fine-Tuning the Convolutional Neural Network Pre-Trained with Synthetic Samples
    Yan, Zhe
    Zhang, Zheng
    Liu, Shaoyong
    ENERGIES, 2021, 14 (12)
  • [45] Fine-tuning Pre-trained Language Models for Few-shot Intent Detection: Supervised Pre-training and Isotropization
    Zhang, Haode
    Liang, Haowen
    Zhang, Yuwei
    Zhan, Liming
    Wu, Xiao-Ming
    Lu, Xiaolei
    Lam, Albert Y. S.
    NAACL 2022: THE 2022 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES, 2022, : 532 - 542
  • [46] BERT4ST: Fine-tuning pre-trained large language model for wind power forecasting
    Lai, Zefeng
    Wu, Tangjie
    Fei, Xihong
    Ling, Qiang
    ENERGY CONVERSION AND MANAGEMENT, 2024, 307
  • [47] Fine-tuning of pre-trained convolutional neural networks for diabetic retinopathy screening: a clinical study
    Roshan, Saboora M.
    Karsaz, Ali
    Vejdani, Amir Hossein
    Roshan, Yaser M.
    INTERNATIONAL JOURNAL OF COMPUTATIONAL SCIENCE AND ENGINEERING, 2020, 21 (04) : 564 - 573
  • [48] Enhancing Alzheimer's Disease Classification with Transfer Learning: Fine-tuning a Pre-trained Algorithm
    Boudi, Abdelmounim
    He, Jingfei
    Abd El Kader, Isselmou
    CURRENT MEDICAL IMAGING, 2024,
  • [49] Parameter-efficient fine-tuning of large-scale pre-trained language models
    Ning Ding
    Yujia Qin
    Guang Yang
    Fuchao Wei
    Zonghan Yang
    Yusheng Su
    Shengding Hu
    Yulin Chen
    Chi-Min Chan
    Weize Chen
    Jing Yi
    Weilin Zhao
    Xiaozhi Wang
    Zhiyuan Liu
    Hai-Tao Zheng
    Jianfei Chen
    Yang Liu
    Jie Tang
    Juanzi Li
    Maosong Sun
    Nature Machine Intelligence, 2023, 5 : 220 - 235
  • [50] Fine-tuning pre-trained neural networks for medical image classification in small clinical datasets
    Newton Spolaôr
    Huei Diana Lee
    Ana Isabel Mendes
    Conceição Veloso Nogueira
    Antonio Rafael Sabino Parmezan
    Weber Shoity Resende Takaki
    Claudio Saddy Rodrigues Coy
    Feng Chung Wu
    Rui Fonseca-Pinto
    Multimedia Tools and Applications, 2024, 83 (9) : 27305 - 27329