Antiviral Peptide-Generative Pre-Trained Transformer (AVP-GPT): A Deep Learning-Powered Model for Antiviral Peptide Design with High-Throughput Discovery and Exceptional Potency


Abstract

Traditional antiviral peptide (AVP) discovery is a time-consuming and expensive process. This study introduces AVP-GPT, a novel deep learning method utilizing transformer-based language models and a multimodal architecture specifically designed for AVP design. AVP-GPT demonstrated exceptional efficiency, generating 10,000 unique peptides and identifying potential AVPs within two days on a GPU system. Pre-trained on a respiratory syncytial virus (RSV) dataset, AVP-GPT successfully adapted to influenza A virus (INFVA) and other respiratory viruses. Compared to state-of-the-art models such as LSTM and SVM, AVP-GPT achieved significantly lower perplexity (2.09 vs. 16.13) and higher AUC (0.90 vs. 0.82), indicating superior peptide sequence prediction and AVP classification. AVP-GPT generated a diverse set of peptides with excellent novelty and identified candidates with markedly higher antiviral success rates than conventional design methods. Notably, AVP-GPT generated novel peptides against RSV and INFVA with exceptional potency, including four peptides exhibiting EC50 values around 0.02 μM, the strongest anti-RSV activity reported to date. These findings highlight AVP-GPT's potential to revolutionize AVP discovery and development, accelerating the creation of novel antiviral drugs. Future studies could explore the application of AVP-GPT to other viral targets and investigate alternative AVP design strategies.
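The two headline metrics above measure different things: perplexity scores how well the generative model predicts held-out peptide sequences (lower is better), while AUC scores how well the classifier separates active from inactive peptides (higher is better). The minimal sketch below shows standard definitions of both metrics; it is illustrative only and is not the AVP-GPT implementation (the function names and example values are assumptions, not from the paper).

```python
import math

def perplexity(nlls):
    """Perplexity = exp(mean per-token negative log-likelihood).

    A model that assigns higher probability to real peptide sequences
    yields lower per-token NLLs and hence lower perplexity.
    """
    return math.exp(sum(nlls) / len(nlls))

def auc(pos_scores, neg_scores):
    """Rank-based AUC: the probability that a randomly chosen active
    peptide receives a higher classifier score than a randomly chosen
    inactive one (ties count half)."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical per-token NLLs for four generated residues:
# each token assigned probability 1/2 -> NLL = ln 2, perplexity = 2.0.
print(perplexity([math.log(2)] * 4))

# Hypothetical classifier scores for active vs. inactive peptides:
# perfect separation -> AUC = 1.0.
print(auc([0.9, 0.8], [0.3, 0.1]))
```

An AUC of 0.90, as reported for AVP-GPT, means a randomly chosen true AVP outranks a randomly chosen non-AVP 90% of the time under this definition.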
