High entropy alloy property predictions using a transformer-based language model

Cited by: 0
Authors
Spyros Kamnis [1 ]
Konstantinos Delibasis [2 ]
Institutions
[1] University of Thessaly, Department of Computer Science and Biomedical Informatics
[2] Castolin Eutectic-Monitor Coatings Ltd.
Keywords
High entropy alloys; Language models; Materials; Design; Machine learning;
DOI
10.1038/s41598-025-95170-z
Abstract
This study introduces a language transformer-based machine learning model to predict key mechanical properties of high-entropy alloys (HEAs), addressing the challenges posed by their complex, multi-principal-element compositions and limited experimental data. By pre-training the transformer on extensive synthetic materials data and fine-tuning it on specific HEA datasets, the model effectively captures intricate elemental interactions through self-attention mechanisms. This transfer-learning approach mitigates data scarcity and improves predictive accuracy for properties such as elongation (%) and ultimate tensile strength compared with traditional regression models such as random forests and Gaussian processes. The model’s interpretability is supported by visualizing attention weights, revealing significant elemental relationships that align with known metallurgical principles. This work demonstrates the potential of transformer models to accelerate materials discovery and optimization by enabling accurate property predictions, thereby advancing the field of materials informatics. To fully realize the model’s potential in practical applications, future studies should incorporate more advanced preprocessing methods, realistic constraints during synthetic dataset generation, and more refined tokenization techniques.
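The abstract describes capturing elemental interactions via self-attention and interpreting the model by visualizing attention weights. A minimal sketch of single-head scaled dot-product self-attention over a toy "composition sequence" of element tokens illustrates where such a weight matrix comes from; this is not the paper's implementation, and the element list, embeddings, and dimensions are illustrative assumptions:

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention.

    X : (n_tokens, d_model) token embeddings, one token per element.
    Returns the attended representation and the attention-weight matrix,
    whose rows can be visualized to interpret pairwise elemental relationships.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # pairwise compatibility
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax: rows sum to 1
    return weights @ V, weights

rng = np.random.default_rng(0)
elements = ["Co", "Cr", "Fe", "Ni", "Mn"]           # e.g. the Cantor alloy
d_model = 8
X = rng.normal(size=(len(elements), d_model))       # toy element embeddings
Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))

out, attn = self_attention(X, Wq, Wk, Wv)
print(attn.shape)         # (5, 5): one row of attention weights per element
print(attn.sum(axis=-1))  # each row sums to 1
```

In a trained model the learned weight matrices would replace the random ones here, and a heatmap of `attn` is the kind of visualization the abstract refers to for interpreting elemental relationships.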
Related Papers
50 items in total
  • [21] Development of a Text Classification Framework using Transformer-based Embeddings
    Yeasmin, Sumona
    Afrin, Nazia
    Saif, Kashfia
    Huq, Mohammad Rezwanul
    PROCEEDINGS OF THE 11TH INTERNATIONAL CONFERENCE ON DATA SCIENCE, TECHNOLOGY AND APPLICATIONS (DATA), 2022, : 74 - 82
  • [22] Transformer-Based Generative Model Accelerating the Development of Novel BRAF Inhibitors
    Yang, Lijuan
    Yang, Guanghui
    Bing, Zhitong
    Tian, Yuan
    Niu, Yuzhen
    Huang, Liang
    Yang, Lei
    ACS OMEGA, 2021, 6 (49): 33864 - 33873
  • [23] A transformer-based deep learning model for Persian moral sentiment analysis
    Karami, Behnam
    Bakouie, Fatemeh
    Gharibzadeh, Shahriar
    JOURNAL OF INFORMATION SCIENCE, 2023,
  • [24] TRACE: Transformer-based continuous tracking framework using IoT and MCS
    Mohammed, Shahmir Khan
    Singh, Shakti
    Mizouni, Rabeb
    Otrok, Hadi
    JOURNAL OF NETWORK AND COMPUTER APPLICATIONS, 2024, 222
  • [25] Towards a Transformer-Based Pre-trained Model for IoT Traffic Classification
    Bazaluk, Bruna
    Hamdan, Mosab
    Ghaleb, Mustafa
    Gismalla, Mohammed S. M.
    da Silva, Flavio S. Correa
    Batista, Daniel Macedo
    PROCEEDINGS OF 2024 IEEE/IFIP NETWORK OPERATIONS AND MANAGEMENT SYMPOSIUM, NOMS 2024, 2024,
  • [26] Microstructure and mechanical property of a novel ReMoTaW high-entropy alloy with high density
    Wei, Qinqin
    Shen, Qiang
    Zhang, Jian
    Chen, Ben
    Luo, Guoqiang
    Zhang, Lianmeng
    INTERNATIONAL JOURNAL OF REFRACTORY METALS & HARD MATERIALS, 2018, 77 : 8 - 11
  • [27] Sentiment Mining in E-Commerce: The Transformer-based Deep Learning Model
    Alsaedi, Tahani
    Nawaz, Asif
    Alahmadi, Abdulrahman
    Rana, Muhammad Rizwan Rashid
    Raza, Ammar
    INTERNATIONAL JOURNAL OF ELECTRICAL AND COMPUTER ENGINEERING SYSTEMS, 2024, 15 (08) : 641 - 650
  • [28] Romanian Fake News Detection Using Machine Learning and Transformer-Based Approaches
    Moisi, Elisa Valentina
    Mihalca, Bogdan Cornel
    Coman, Simina Maria
    Pater, Alexandrina Mirela
    Popescu, Daniela Elena
    APPLIED SCIENCES-BASEL, 2024, 14 (24):
  • [29] Identification of Intra-Domain Ambiguity using Transformer-based Machine Learning
    Moharil, Ambarish
    Sharma, Arpit
    2022 IEEE/ACM 1ST INTERNATIONAL WORKSHOP ON NATURAL LANGUAGE-BASED SOFTWARE ENGINEERING (NLBSE 2022), 2022, : 51 - 58
  • [30] In-Context Learning for MIMO Equalization Using Transformer-Based Sequence Models
    Zecchin, Matteo
    Yu, Kai
    Simeone, Osvaldo
    2024 IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS WORKSHOPS, ICC WORKSHOPS 2024, 2024, : 1573 - 1578