Predicting RNA Structure Utilizing Attention from Pretrained Language Models


Abstract

RNA possesses functional significance that extends beyond the transfer of genetic information. The functional roles of noncoding RNAs are often mediated through their secondary and tertiary structures, so predicting RNA structure holds great promise for enabling applications in diagnostics and therapeutics. However, predicting the three-dimensional (3D) structure of RNA remains challenging. Applying artificial intelligence techniques from natural language processing, particularly large language models (LLMs), could incorporate evolutionary information into RNA 3D structure prediction and address both resource and data scarcity limitations. As in their successful application to protein structure prediction, such models could achieve faster inference while maintaining accuracy comparable to time-consuming multiple sequence alignment (MSA) schemes. Herein, we evaluate the suitability of currently available pretrained nucleic acid language models (RNABERT, ERNIE-RNA, RNA Foundation Model (RNA-FM), RiboNucleic Acid Language Model (RiNALMo), and DNABERT) for predicting secondary and tertiary RNA structures. We demonstrate that current nucleic acid language models do not effectively capture structural information, mainly due to architectural constraints.
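The evaluation described above rests on a common recipe: extract attention maps from a pretrained language model, symmetrize them (base pairing between positions i and j is an unordered relation), and threshold the result into a predicted base-pair contact map. The sketch below is illustrative only, not the paper's code: the attention tensor is simulated with random values standing in for a real model's outputs, and the min-max-plus-threshold rule is a hypothetical simplification of the learned probing heads typically used in practice.

```python
import random

def attention_to_contact_map(attn, threshold=0.6):
    """Convert per-head attention maps into a symmetric base-pair contact map.

    attn: list of num_heads matrices, each L x L (nested lists of floats),
    standing in for attention weights from a pretrained model such as
    RNA-FM or RiNALMo. Returns an L x L boolean matrix of predicted contacts.
    """
    heads = len(attn)
    L = len(attn[0])
    # Average the attention weights over all heads.
    avg = [[sum(h[i][j] for h in attn) / heads for j in range(L)]
           for i in range(L)]
    # Symmetrize: attention is directed, but base pairing is not.
    sym = [[0.5 * (avg[i][j] + avg[j][i]) for j in range(L)]
           for i in range(L)]
    # Min-max normalize to [0, 1], then threshold into predicted contacts.
    lo = min(min(row) for row in sym)
    hi = max(max(row) for row in sym)
    span = (hi - lo) or 1.0
    return [[(v - lo) / span > threshold for v in row] for row in sym]

# Simulated 4-head attention over a 10-nucleotide sequence.
random.seed(0)
attn = [[[random.random() for _ in range(10)] for _ in range(10)]
        for _ in range(4)]
contacts = attention_to_contact_map(attn)
print(len(contacts), len(contacts[0]))  # 10 10
```

A real evaluation would replace the simulated tensor with per-layer attention from the models under test and score the thresholded map against known secondary structures (e.g. F1 over annotated base pairs); the paper's finding is that this signal is weak in current nucleic acid language models.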
