Uncover spatially informed variations for single-cell spatial transcriptomics with STew


Abstract

Motivation: Recent spatial transcriptomics (ST) technologies have enabled the characterization of gene expression patterns together with spatial information, advancing our understanding of cell lineages within diseased tissues. Several analytical approaches have been proposed for ST data, but effectively using spatial information to unveil the variation it shares with gene expression remains a challenge.

Results: We introduce STew, a Spatial Transcriptomic multi-viEW representation learning method, to jointly analyze spatial information and gene expression in a scalable manner, followed by a data-driven statistical framework to measure goodness of model fit. In benchmarks on human dorsolateral prefrontal cortex and mouse main olfactory bulb data with manual annotations as ground truth, STew achieved superior performance in both clustering accuracy and continuity of the identified spatial domains compared with other methods. STew is also robust, generating consistent results that are insensitive to model parameters, including sparsity constraints. We next applied STew to ST data acquired from 10× Visium, Slide-seqV2, and 10× Xenium, encompassing both single-cell and multi-cellular resolution ST technologies, which revealed spatially informed cell type clusters and biologically meaningful axes. In particular, we identified a proinflammatory fibroblast spatial niche using ST data from psoriatic skin. Moreover, STew scales almost linearly with the number of spatial locations, guaranteeing its applicability to datasets with thousands of spatial locations to capture disease-relevant niches in complex tissues.

Availability and implementation: Source code and the R software tool STew are available from github.com/fanzhanglab/STew.
