Direct conversion of peptides into diverse peptidomimetics using a transformer-based chemical language model

Cited: 0
Authors
Yoshimori, Atsushi [1 ,2 ,3 ]
Bajorath, Juergen [1 ,2 ]
Affiliations
[1] Univ Bonn, Dept Life Sci Informat & Data Sci, LIMES Program, B-IT, Unit Chem Biol & Med Chem, Friedrich Hirzebruch Allee 5-6, D-53115 Bonn, Germany
[2] Univ Bonn, Lamarr Inst Machine Learning & Artificial Intellig, Friedrich Hirzebruch Allee 5-6, D-53115 Bonn, Germany
[3] Inst Theoret Med Inc, 26-1 Muraoka Higashi 2-Chome, Fujisawa, Kanagawa 2510012, Japan
Source
EUROPEAN JOURNAL OF MEDICINAL CHEMISTRY REPORTS | 2025, Vol. 13
Keywords
Peptides; Peptidomimetics; Generative molecular design; Chemical language models; Peptide-to-compound mapping; PROTEIN; DESIGN; INHIBITORS; MOLECULES;
DOI
10.1016/j.ejmcr.2025.100249
Chinese Library Classification (CLC)
R914 [Medicinal Chemistry];
Discipline code
100701;
Abstract
The design of pharmaceutically relevant compounds that mimic bioactive peptides or secondary structure elements in proteins is an important task in medicinal chemistry. Over time, various chemical strategies have been developed to convert natural peptide ligands into so-called peptidomimetics. This process is supported by computational approaches that identify peptidomimetic candidate compounds or design templates mimicking active peptide conformations. However, generating peptidomimetics remains challenging. Chemical language models (CLMs) offer new opportunities for molecular design. We have therefore revisited the computational design of peptidomimetics from a different perspective and devised a CLM that directly transforms input peptides into peptidomimetic candidates, without requiring intermediate states. A critically important aspect of the approach was the generation of training data for effective learning, guided by a quantitative measure of peptide-likeness, such that the CLM could implicitly capture transitions from peptides or peptide-like molecules to compounds with reduced or eliminated peptide character. Herein, we introduce the CLM for peptidomimetic design and establish proof-of-principle for the approach. For given input peptides, both the general model and a version fine-tuned for a specific application were shown to produce a spectrum of candidate compounds with varying similarity, gradually changing chemical features, and diminishing peptide-likeness. As part of our study, the CLM and data are provided.
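The abstract names two key ingredients: a quantitative measure of peptide-likeness used to guide training data generation, and a CLM that maps input peptides to candidate compounds with reduced peptide character. The record does not include the authors' code, so the following is only a minimal sketch under stated assumptions: peptide_likeness (amide bonds per heavy atom, computed with RDKit) is a hypothetical stand-in for the measure used in the paper, and make_training_pairs with its min_drop threshold is an illustrative filter for assembling peptide-to-compound training pairs, not the published procedure.

```python
# Minimal sketch (not the authors' code): a crude peptide-likeness proxy and
# its use for assembling peptide -> peptidomimetic training pairs for a
# SMILES-to-SMILES chemical language model. Scoring and thresholds are
# illustrative assumptions only.

from rdkit import Chem

# Backbone-like amide (peptide) bond pattern.
AMIDE = Chem.MolFromSmarts("[CX3](=O)[NX3]")


def peptide_likeness(smiles: str) -> float:
    """Rough peptide-likeness proxy: amide bonds per heavy atom.
    Stand-in for the quantitative measure referenced in the abstract."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None or mol.GetNumHeavyAtoms() == 0:
        return 0.0
    return len(mol.GetSubstructMatches(AMIDE)) / mol.GetNumHeavyAtoms()


def make_training_pairs(pairs, min_drop=0.02):
    """Keep (source, target) SMILES pairs only if the target shows clearly
    reduced peptide character relative to the source (hypothetical criterion)."""
    kept = []
    for src, tgt in pairs:
        if peptide_likeness(src) - peptide_likeness(tgt) >= min_drop:
            kept.append((src, tgt))
    return kept


if __name__ == "__main__":
    candidate_pairs = [
        # Ala-Ala-Ala tripeptide vs. an analog with one amide bond reduced
        # to a secondary amine (a classic peptidomimetic modification).
        ("CC(N)C(=O)NC(C)C(=O)NC(C)C(=O)O",
         "CC(N)C(=O)NC(C)CNC(C)C(=O)O"),
    ]
    print(make_training_pairs(candidate_pairs))
```

Pairs filtered in this way would then be tokenized (e.g., character- or atom-level SMILES tokens) and used as source/target sequences for a standard encoder-decoder transformer, which is one plausible reading of the peptide-to-compound mapping described in the abstract.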
Pages: 6