Parameter-efficient fine-tuning enables scalable transfer of regulatory sequence models to novel contexts


Abstract

BACKGROUND: DNA sequence deep learning models can accurately predict epigenetic and transcriptional profiles, enabling analysis of gene regulation and genetic variant effects. While large-scale models like Enformer and Borzoi are trained on abundant data, they cannot cover all cell states and assays, necessitating new models to analyze gene regulation in novel contexts. However, training models from scratch for new datasets is computationally expensive. RESULTS: In this study, we systematically develop and evaluate a transfer learning framework based on parameter-efficient fine-tuning for supervised regulatory sequence models. Using the state-of-the-art model Borzoi, our framework enables accurate model transfer while significantly reducing runtime and memory requirements. Across bulk and single-cell RNA-seq datasets, the transferred models effectively predict held-out gene expression changes, identify regulatory drivers in perturbation conditions, and predict cell-type-specific variant effects. We further demonstrate that transferring Borzoi to relevant cell types facilitates mechanistic interpretation of fine-mapped GWAS variants. CONCLUSIONS: Our framework offers a scalable and practical solution for extending large sequence models to novel biological contexts, enabling mechanistic insight into gene regulation and variant effects.
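To make the core idea concrete, the sketch below illustrates low-rank adaptation (LoRA), one common parameter-efficient fine-tuning technique: a large pretrained weight matrix is frozen and only a small low-rank update is trained. This is a generic illustration under assumed dimensions, not the authors' actual framework or the Borzoi codebase.

```python
import numpy as np

rng = np.random.default_rng(0)

d_in, d_out, rank = 64, 32, 4  # illustrative sizes, not Borzoi's

# Frozen pretrained weight matrix (stands in for one layer of a
# large sequence model such as Borzoi).
W = rng.normal(size=(d_out, d_in))

# Trainable low-rank factors: only rank * (d_in + d_out) new
# parameters instead of d_in * d_out for full fine-tuning.
A = rng.normal(scale=0.01, size=(rank, d_in))
B = np.zeros((d_out, rank))  # zero init: adapted model starts identical

def adapted_forward(x, alpha=8.0):
    """Forward pass: frozen weight plus scaled low-rank update."""
    return W @ x + (alpha / rank) * (B @ (A @ x))

x = rng.normal(size=d_in)
full_params = W.size           # 2048 parameters to tune fully
lora_params = A.size + B.size  # 384 parameters with the adapter
print(f"trainable params: {lora_params} vs full fine-tune: {full_params}")

# Because B starts at zero, the adapted output equals the base output,
# so fine-tuning begins from the pretrained model's behavior.
assert np.allclose(adapted_forward(x), W @ x)
```

The parameter savings here (384 vs. 2048) are modest at toy scale but grow with layer width, which is what makes this style of transfer tractable in runtime and memory for models of Borzoi's size.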
