scPerb: Predict single-cell perturbation via style transfer-based variational autoencoder


Abstract

INTRODUCTION: Traditional methods for measuring cellular responses after perturbation are labor-intensive and costly, especially across multiple experimental conditions. Accurate computational prediction of cellular responses to perturbations is therefore of great importance. Existing methodologies, such as graph-based approaches, vector arithmetic, and neural networks, either mix perturbation-related variances with cell-type-specific patterns or distinguish them only implicitly within black-box models.

OBJECTIVES: This study introduces and demonstrates scPerb, a novel framework that explicitly extracts perturbation-related variances and transfers them from unperturbed to perturbed cells to accurately predict the effect of perturbation at the single-cell level.

METHODS: scPerb adopts a style-transfer strategy by incorporating a style encoder into the architecture of a variational autoencoder. The style encoder captures the differences between the latent representations of unperturbed and perturbed cells, enabling accurate prediction of post-perturbation gene expression.

RESULTS: Comprehensive comparisons with existing methods demonstrate that scPerb delivers improved performance and higher accuracy in predicting cellular responses to perturbations. Notably, scPerb outperforms other methods across multiple datasets, achieving superior R² values of 0.98, 0.98, and 0.96 on three benchmarking datasets.

CONCLUSION: scPerb offers a significant advancement in predicting cellular responses by effectively separating and transferring perturbation-related variances. This framework not only enhances prediction accuracy but also provides a robust tool for computational biology, addressing the limitations of current methodologies.
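The mechanism described in METHODS (encode unperturbed cells, add a learned "style" shift in latent space, decode to obtain predicted post-perturbation expression) can be illustrated with a toy linear autoencoder. This is a minimal sketch of the underlying idea, not the scPerb implementation: scPerb learns its encoder, decoder, and style representation with neural networks, whereas here the decoder exactly inverts a random linear encoder, the style vector is a simple mean difference of latent codes, and all dimensions, names, and simulated data are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n_genes = 20  # toy dimensionality (illustrative)

# Linear stand-ins for the trained encoder/decoder networks:
# the decoder exactly inverts the encoder, so reconstruction is lossless.
W_enc = rng.normal(size=(n_genes, n_genes))
W_dec = np.linalg.inv(W_enc)

def encode(x):
    return x @ W_enc

def decode(z):
    return z @ W_dec

# Simulated expression matrices (cells x genes): in this toy example the
# perturbation adds a fixed shift to every cell.
control = rng.normal(size=(200, n_genes))
true_shift = rng.normal(size=n_genes)
perturbed = control + true_shift

# "Style": the difference between the latent representations of the two
# conditions (scPerb learns this with a dedicated style encoder).
style = encode(perturbed).mean(axis=0) - encode(control).mean(axis=0)

# Transfer the style to held-out control cells and decode.
held_out = rng.normal(size=(20, n_genes))
predicted = decode(encode(held_out) + style)
expected = held_out + true_shift  # ground truth in this simulation

# R^2 between predicted and true post-perturbation expression.
ss_res = ((predicted - expected) ** 2).sum()
ss_tot = ((expected - expected.mean()) ** 2).sum()
r2 = 1.0 - ss_res / ss_tot
```

Because the toy encoder is linear and exactly invertible, the transferred style recovers the simulated perturbation almost perfectly (R² ≈ 1); in the real setting the nonlinear networks must learn this mapping from data.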
