Enhanced Lithology Classification Using an Interpretable SHAP Model Integrating Semi-Supervised Contrastive Learning and Transformer with Well Logging Data

Cited by: 0
Authors
Sun, Youzhuang [1 ,2 ]
Pang, Shanchen [1 ,2 ]
Li, Hengxiao [1 ,2 ]
Qiao, Sibo [3 ]
Zhang, Yongan [1 ,2 ]
Affiliations
[1] China Univ Petr East China, Coll Comp Sci, Qingdao, Shandong, Peoples R China
[2] China Univ Petr East China, Qingdao Coll Software, Qingdao, Shandong, Peoples R China
[3] Tiangong Univ, Coll Software, Tianjin, Peoples R China
Keywords
Lithology prediction; Logging parameters; Machine learning; Transformer; Contrastive learning;
DOI
10.1007/s11053-024-10452-z
Chinese Library Classification (CLC)
P [Astronomy, Earth Sciences];
Discipline Code
07;
Abstract
In petroleum and natural gas exploration, lithology identification (analyzing rock types beneath the Earth's surface) is crucial for assessing hydrocarbon reservoirs and optimizing drilling strategies. Traditionally, this process relies on logging data such as gamma ray and resistivity measurements, which often require manual interpretation, making it labor-intensive and error-prone. To address these challenges, we propose a novel machine learning framework, contrastive learning-transformer, that leverages self-attention mechanisms to enhance the accuracy of lithology identification. Our method first extracts unlabeled samples from logging data while obtaining labeled core sample data. Through self-supervised contrastive learning and a transformer backbone network, we optimize performance using techniques such as batch normalization. After pretraining, the model is fine-tuned with a limited number of labeled samples, improving accuracy while significantly reducing reliance on large labeled datasets and thereby lowering the cost of drilling core annotation. Additionally, we incorporate Shapley additive explanations (SHAP) to make the model's decision-making process more transparent, enabling analysis of each feature's contribution to lithology predictions. The model also learns time-reversal invariance by reversing sequential data, ensuring reliable identification even when the order of the data varies. Experimental results demonstrate that our transformer model, combined with semi-supervised contrastive learning, significantly outperforms traditional methods, achieving more precise lithology identification, especially in complex geological environments.
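The contrastive pretraining and time-reversal augmentation summarized above can be sketched in a minimal form. The NumPy implementation below is an illustrative assumption, not the authors' code: it shows an InfoNCE-style contrastive loss (where the two "views" of a log sequence's embedding form positive pairs) and the depth-reversal augmentation used to learn time-reversal invariance.

```python
import numpy as np

def time_reversal(x):
    """Augmentation: reverse a well-log sequence along depth,
    producing the second 'view' for contrastive learning."""
    return x[::-1].copy()

def info_nce_loss(z1, z2, tau=0.1):
    """InfoNCE (NT-Xent-style) loss between two batches of embeddings.

    z1, z2: (N, d) arrays; row i of z1 and row i of z2 are a positive
    pair (two augmented views of the same log segment), all other rows
    act as negatives. tau is the softmax temperature.
    """
    # L2-normalize so the dot product is cosine similarity.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / tau                       # (N, N) similarities
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Positive pairs sit on the diagonal; minimize their negative log-prob.
    return -np.mean(np.diag(log_prob))
```

In the full framework, the embeddings `z1`/`z2` would come from the transformer backbone applied to the original and time-reversed log segments, after which the pretrained encoder is fine-tuned on the small labeled core set.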
Pages: 785-813 (29 pages)