Steel design based on a large language model

Cited: 2
Authors
Tian, Shaohan [1 ]
Jiang, Xue [1 ,2 ]
Wang, Weiren [1 ]
Jing, Zhihua [1 ]
Zhang, Chi [1 ]
Zhang, Cheng [1 ]
Lookman, Turab [3 ]
Su, Yanjing [1 ]
Affiliations
[1] Univ Sci & Technol Beijing, Inst Adv Mat & Technol, Beijing Adv Innovat Ctr Mat Genome Engn, Beijing 100083, Peoples R China
[2] Liaoning Acad Mat, Shenyang 110000, Liaoning, Peoples R China
[3] AiMaterials Res LLC, Santa Fe, NM 87501 USA
Funding
National Natural Science Foundation of China;
Keywords
Property prediction; Steel design; Materials language model; Deep learning; Artificial intelligence; MACHINE; STRENGTH;
DOI
10.1016/j.actamat.2024.120663
Chinese Library Classification
T [Industrial Technology];
Discipline Code
08;
Abstract
The success of artificial intelligence (AI) in materials research heavily relies on the integrity of structured data and the construction of precise descriptors. In this study, we present an end-to-end pipeline from materials text to properties for steels based on a large language model. The objective is to enable quantitative, high-accuracy property predictions and to explore new steels. The pipeline includes a materials language encoder, named SteelBERT, and a multimodal deep learning framework that maps the composition and the text sequence of complex fabrication processes to mechanical properties. We demonstrate high accuracy on mechanical properties, including yield strength (YS), ultimate tensile strength (UTS), and elongation (EL), with coefficients of determination (R²) reaching 78.17 % (±3.40 %), 82.56 % (±1.96 %), and 81.44 % (±2.98 %), respectively. Further, through an additional fine-tuning strategy for the design of specific steels with small datasets, we show how the performance can be refined. With only 64 experimental samples of 15Cr austenitic stainless steels, we obtain an optimized model with R² of 89.85 % (±6.17 %), 88.34 % (±5.95 %), and 87.24 % (±5.15 %) for YS, UTS, and EL; the user inputs a composition and a processing text sequence, and the model outputs mechanical properties. The model efficiently optimizes the fabrication-process text sequence, suggesting a secondary round of cold rolling and tempering to yield an exceptional YS of 960 MPa, UTS of 1138 MPa, and EL of 32.5 %, exceeding those of reported 15Cr austenitic stainless steels.
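The multimodal framework the abstract describes, a text encoder fused with a composition vector to predict YS, UTS, and EL, can be sketched in miniature. This is an illustrative NumPy sketch, not the authors' implementation: `mock_text_embedding` is a hypothetical stand-in for a SteelBERT-style encoder, `FusionMLP` is a tiny untrained network, and the composition values are invented for demonstration.

```python
import zlib

import numpy as np


def mock_text_embedding(process_text: str, dim: int = 16) -> np.ndarray:
    """Hypothetical stand-in for a SteelBERT-style encoder: deterministically
    hashes the processing description into a fixed-size vector."""
    seed = zlib.crc32(process_text.encode("utf-8"))
    return np.random.default_rng(seed).standard_normal(dim)


class FusionMLP:
    """Tiny untrained MLP fusing a composition vector with a text embedding
    to predict three targets (YS, UTS, EL). Illustrative only."""

    def __init__(self, comp_dim: int, text_dim: int, hidden: int = 32, seed: int = 0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.standard_normal((comp_dim + text_dim, hidden)) * 0.1
        self.b1 = np.zeros(hidden)
        self.W2 = rng.standard_normal((hidden, 3)) * 0.1
        self.b2 = np.zeros(3)

    def predict(self, comp: np.ndarray, text_emb: np.ndarray) -> np.ndarray:
        x = np.concatenate([comp, text_emb])        # early fusion by concatenation
        h = np.maximum(x @ self.W1 + self.b1, 0.0)  # ReLU hidden layer
        return h @ self.W2 + self.b2                # [YS, UTS, EL], arbitrary scale here


# Hypothetical 15Cr-type composition: wt.% C, Cr, Ni, Mo (invented values)
comp = np.array([0.08, 15.0, 10.0, 2.0])
process = "solution treatment; cold rolling; tempering"
model = FusionMLP(comp_dim=comp.size, text_dim=16)
pred = model.predict(comp, mock_text_embedding(process))
print(pred.shape)  # (3,) -> one value each for YS, UTS, EL
```

In the paper's actual pipeline the embedding comes from the pretrained SteelBERT encoder and the fusion network is trained end-to-end on the steel dataset; the concatenation-then-MLP pattern above merely illustrates how a processing-text embedding and a composition vector can share one prediction head.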
Pages: 13