High entropy alloy property predictions using a transformer-based language model

Cited: 0
Authors
Spyros Kamnis [1 ]
Konstantinos Delibasis [2 ]
Affiliations
[1] University of Thessaly, Department of Computer Science and Biomedical Informatics
[2] Castolin Eutectic-Monitor Coatings Ltd.
Keywords
High entropy alloys; Language models; Materials; Design; Machine learning;
DOI
10.1038/s41598-025-95170-z
Abstract
This study introduces a language transformer-based machine learning model to predict key mechanical properties of high-entropy alloys (HEAs), addressing the challenges posed by their complex, multi-principal-element compositions and limited experimental data. By pre-training the transformer on extensive synthetic materials data and fine-tuning it with specific HEA datasets, the model effectively captures intricate elemental interactions through self-attention mechanisms. This approach mitigates data scarcity via transfer learning, improving predictive accuracy for properties such as elongation (%) and ultimate tensile strength compared with traditional regression models such as random forests and Gaussian processes. The model’s interpretability is enhanced by visualizing attention weights, revealing significant elemental relationships that align with known metallurgical principles. This work demonstrates the potential of transformer models to accelerate materials discovery and optimization through accurate property prediction, thereby advancing the field of materials informatics. To fully realize the model’s potential in practical applications, future studies should incorporate more advanced preprocessing methods, realistic constraints during synthetic dataset generation, and more refined tokenization techniques.
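The core mechanism the abstract describes — self-attention over the elements of an alloy composition, with the attention weights doubling as an interpretability signal — can be sketched as follows. This is not the authors' code: the element list, embedding size, and all weights are hypothetical random placeholders, and only the attention mechanics are faithful to the described approach.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 8  # embedding dimension (illustrative choice)

# Toy element "vocabulary": each element symbol maps to a random embedding.
elements = ["Co", "Cr", "Fe", "Ni", "Mn"]
embed = {el: rng.normal(size=D) for el in elements}

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(tokens, fractions):
    """One scaled dot-product self-attention layer over element tokens.

    Molar fractions scale each token embedding so that composition, not
    just element identity, shapes the attention pattern."""
    X = np.stack([f * embed[t] for t, f in zip(tokens, fractions)])
    Wq, Wk, Wv = (rng.normal(size=(D, D)) for _ in range(3))
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    A = softmax(Q @ K.T / np.sqrt(D))  # element-by-element attention weights
    return A @ V, A

# Equiatomic CoCrFeNiMn (the Cantor alloy) as a token sequence.
tokens = ["Co", "Cr", "Fe", "Ni", "Mn"]
fractions = [0.2] * 5
H, A = self_attention(tokens, fractions)

# Mean-pool the token representations and apply a linear regression head,
# standing in for a property prediction (e.g. ultimate tensile strength).
w_head = rng.normal(size=D)
prediction = float(H.mean(axis=0) @ w_head)

# Each row of A sums to 1: the attention one element pays to all others,
# which is what the paper visualizes for interpretability.
print(A.shape)  # (5, 5)
```

In the actual model these weight matrices are learned during pre-training on synthetic data and fine-tuning on HEA datasets; inspecting the rows of the attention matrix `A` is then what surfaces the elemental relationships the abstract refers to.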