Introducing pre-trained transformers for high entropy alloy informatics

Cited by: 0
Author: Kamnis, Spyros [1]
Affiliation: [1] Castolin Eutectic UK, Newcastle upon Tyne, NE29 8SE, United Kingdom
Keywords: Machine learning
Indexed in: Compendex
DOI: not available