

Progress in protein pre-training models integrating structural knowledge

Tang Tian-Yi, Xiong Yi-Ming, Zhang Rui-Ge, Zhang Jian, Li Wen-Fei, Wang Jun, Wang Wei
  • The AI revolution sparked by natural language and image processing has brought new ideas and research paradigms to the field of protein computing. One significant advance is the pre-training of protein language models by self-supervised learning on massive protein sequence corpora. These pre-trained models encode diverse information about protein sequences, evolution, structure, and even function; they transfer readily to a wide range of downstream tasks and show strong generalization. Building on this, researchers are further developing multimodal pre-trained models that integrate more diverse types of data. Because structure is the primary determinant of a protein's function, pre-trained models that incorporate structural information can better support downstream tasks. This paper reviews recent studies in this direction from the following aspects. First, protein pre-training models that integrate structural information into language models are reviewed. Second, pre-trained models that integrate protein dynamics are introduced; these may benefit tasks such as protein-protein interaction, soft docking of ligands, and interactions involving allosteric proteins and intrinsically disordered proteins. Third, pre-trained models that incorporate knowledge such as the Gene Ontology are described. Fourth, pre-trained models in the RNA field are briefly introduced. Finally, recent developments in protein design are presented, and their relationship to the structure-aware pre-trained models above is discussed, along with the current state of these areas, their difficulties, and possible solutions.
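The masked-language-model (MLM) objective mentioned in the abstract can be sketched in a few lines: a fraction of the residues in a protein sequence is hidden, and the model is trained to reconstruct them from the surrounding context. The routine below is a minimal illustration, not code from any of the reviewed models; the mask token, mask rate, and toy sequence are assumptions.

```python
import random

MASK = "#"  # placeholder mask token (assumption; real models use a dedicated <mask> id)

def mask_sequence(seq, rate=0.15, seed=0):
    """Corrupt a protein sequence for MLM pre-training: hide `rate` of the
    residues and return (corrupted sequence, positions to reconstruct)."""
    rng = random.Random(seed)
    n_mask = max(1, int(len(seq) * rate))          # how many residues to hide
    positions = sorted(rng.sample(range(len(seq)), n_mask))
    corrupted = list(seq)
    for p in positions:
        corrupted[p] = MASK                        # model must predict seq[p] here
    return "".join(corrupted), positions

seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"  # toy sequence, not from the paper
corrupted, targets = mask_sequence(seq)
```

Real implementations typically follow the BERT recipe, masking about 15% of tokens and replacing a portion of them with random residues rather than always using the mask token.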
        Corresponding authors: Zhang Jian, jzhang@nju.edu.cn; Wang Wei, wangwei@nju.edu.cn
      • Funds: Project supported by the Science and Technology Innovation Project of the Ministry of Science and Technology of China (Grant No. 2030-2021ZD0201300) and the National Natural Science Foundation of China (Grant No. 11934008).
    • Summary of the pre-trained models discussed in this paper (– marks entries not reported):

      | Model | Year | Architecture | Data modality | Pre-training method | Training set | Params | Compute | Downstream tasks | Ref. |
      | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
      | *General-purpose protein pre-training models integrating structural information* | | | | | | | | | |
      | Bepler & Berger | 2019 | Bi-LSTM | sequence, structure | MLM for sequences; supervised learning for 3D structures | 76M sequences, 28K structures | – | 1× 32G V100, 13–51 days | fold classification, transmembrane region prediction | [19,42] |
      | Guo model | 2022 | CNN | structure | self-supervised pre-training on noised pair distances | 73K structures | – | – | QA, PPI | [43] |
      | New IEConv | 2022 | GCN | sequence, structure | contrastive learning between randomly sampled 3D substructures | 476K chains | 30M | – | protein function prediction, fold classification, structural similarity prediction, protein-ligand binding affinity prediction | [44] |
      | GearNet | 2023 | ESM-1b, GearNet | sequence, structure | PLM, contrastive learning | 805K structures from AlphaFoldDB | – | 4× A100 | fold classification, EC, GO | – |
      | STEPS | 2023 | BERT, GCN | sequence, structure | PLM, supervised learning from 3D structures | 40K structures | – | – | membrane protein classification, cellular location prediction, EC | – |
      | Uni-Mol | 2023 | Transformer | sequence, structure | atom 3D position denoising, masked atom type prediction | 209M molecular conformations, 3.2M protein pocket structures | – | 8× 32G V100, 3 days | molecular property prediction, molecular conformation generation, pocket property prediction, protein-ligand binding pose prediction | – |
      | SaProt | 2023 | BERT | sequence, structure | structures converted to a structure-aware vocabulary, MLM | 40M sequences and structures from PDB/AlphaFoldDB | 650M | 64× 80G A100, 3 months | thermostability, HumanPPI, metal ion binding, EC, GO, DeepLoc, contact prediction | [51] |
      | *Non-general-purpose protein pre-training models integrating structural information* | | | | | | | | | |
      | Evoformer | 2021 | Evoformer | sequence, structure | MLM, supervised learning | BFD + Uniclust30, PDB | – | 128 TPU-v3, 11 days | structure prediction | [2] |
      | DeepFRI | 2021 | LSTM+GCN | sequence, structure | PLM (pre-trained, frozen), supervised learning for 3D structures | 10M sequences for pre-training | – | – | GO, EC, PPI interaction sites | [47] |
      | LM-GVP | 2022 | Transformer+GVP | sequence, structure | PLM (trainable), supervised learning for 3D structures | – | – | 8× 32G V100 | fluorescence, protease stability, GO, mutational effects | [48] |
      | ProNet | 2023 | GCN | sequence, structure | supervised learning | – | – | – | fold classification, reaction classification, binding affinity, PPI | – |
      | HoloProt | 2022 | MPN | sequence, structure, surface | supervised learning | – | 1.8M | 1× 1080Ti, 1 day | ligand binding affinity, EC | [56] |
      | *Pre-training models encoding dynamic 3D structural information* | | | | | | | | | |
      | ProtMD | 2022 | E(3)-equivariant graph matching network | sequence, structure trajectory | self-supervised learning: atom-level prompt-based denoising, conformation-level snapshot ordering | 62.8K MD snapshots for 64 protein-ligand pairs | 5.2M | 4× V100 | binding affinity prediction, binary classification of ligand efficacy | [58] |
      | *Knowledge-enhanced protein pre-training models* | | | | | | | | | |
      | OntoProtein | 2022 | ProtBert, Gu model | sequence, knowledge | MLM, contrastive learning | ProteinKG25 with 5M knowledge triples | – | V100 | TAPE, PPI, protein function prediction | [60] |
      | KeAP | 2023 | ProtBert, Gu model | sequence, knowledge | MLM | ProteinKG25 | – | – | TAPE, PPI, protein function prediction | [62] |
      | ProtST | 2023 | ProtBert, ESM-1b, ESM-2, PubMedBERT | sequence, knowledge | MLM, multimodal representation alignment, multimodal mask prediction | ProtDescribe with 553K sequence-property pairs | – | 4× V100 | protein localization prediction, fitness landscape prediction, protein function annotation | [63] |
      | *RNA language models* | | | | | | | | | |
      | RNA-FM | 2022.8 | BERT | sequence | MLM | RNAcentral, 23.7M ncRNA sequences | – | 8× 80G A100, 1 month | SS prediction, 3D contact/distance maps, 3D reconstruction, evolutionary study, RNA-protein interaction, MRL prediction | [78] |
      | RNABert | 2022 | BERT | sequence | MLM | RNAcentral (762K) & Rfam 14.3 | – | V100 | structural alignment, clustering | [86] |
      | SpliceBERT | 2023 | BERT | sequence | MLM | pre-mRNA of 72 vertebrates, 2M sequences, 64B nucleotides | 19.4M | 8× V100, 1 week | multi-species splice site prediction, human branch point prediction | [79] |
      | RNA-MSM | 2023 | MSA Transformer | sequence | MLM | 4069 RNA families from Rfam 14.7 | – | 8× 32G V100 | SS prediction, solvent accessibility prediction | [83] |
      | Uni-RNA | 2023 | BERT | sequence | MLM | RNAcentral & nt & GWH (1 billion sequences) | 25–400M | 128× A100 | SS prediction, 3D structure prediction, MRL, isoform percentage prediction on 3'UTR, splice site prediction, ncRNA functional family classification, modification site prediction | [84] |
      | RNAErnie | 2024 | ERNIE | sequence, motif information | MLM with base-, subsequence-, and motif-level masking | RNAcentral, 23M ncRNA sequences | 105M | 4× 32G V100, 250 hours | sequence classification, RNA-RNA interaction, SS prediction | [85] |

      *PLM, protein language model; MLM, masked language model; GCN, graph convolutional network; GVP, geometric vector perceptron; EC, enzyme commission number prediction; GO, gene ontology term prediction; PPI, protein-protein interaction; TAPE, tasks assessing protein embeddings benchmark; QA, quality assessment of structures; SS, secondary structure; MRL, mean ribosome load prediction in mRNA.
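Several of the models above (e.g. GearNet) align a sequence encoder with a structure encoder by contrastive learning. A minimal NumPy sketch of an InfoNCE-style objective is given below: the sequence and structure embeddings of the same protein form the positive pair, and all other in-batch pairs serve as negatives. The temperature value and embedding shapes are illustrative assumptions, not settings from the cited papers.

```python
import numpy as np

def info_nce(seq_emb, struct_emb, temperature=0.1):
    """Contrastive loss over a batch: row i of seq_emb should match row i of
    struct_emb (positive pair); every other row in the batch is a negative."""
    # L2-normalize so the dot product is a cosine similarity
    seq = seq_emb / np.linalg.norm(seq_emb, axis=1, keepdims=True)
    st = struct_emb / np.linalg.norm(struct_emb, axis=1, keepdims=True)
    logits = seq @ st.T / temperature
    logits = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # cross-entropy with the diagonal (matched pairs) as the target class
    return -np.mean(np.diag(log_probs))
```

The loss is near zero when the paired embeddings already agree and grows as they diverge, which is what pushes the two encoders toward a shared representation space.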

    Publication history
    • Received: 2024-06-07
    • Revised: 2024-07-12
    • Published online: 2024-08-09
