Bright-field to fluorescence microscopy image translation for cell nuclei health quantification


Abstract

Microscopy is a widely used method in biological research for observing the morphology and structure of cells. Among the many microscopy techniques, fluorescent labeling with dyes or antibodies is the most popular way to reveal specific cellular organelles. However, fluorescent labeling also introduces new challenges to cellular observation: it increases the experimental workload, and the labeling process may be nonspecific. Recent advances in deep visual learning have shown that systematic relationships exist between fluorescent and bright-field images, making image translation between the two feasible. In this article, we propose the cross-attention conditional generative adversarial network (XAcGAN) model. It builds on state-of-the-art generative adversarial networks (GANs) to solve the image translation task. The model uses supervised learning and incorporates attention-based networks to exploit spatial information during translation. In addition, we demonstrate the successful application of XAcGAN to infer the health state of translated nuclei from bright-field microscopy images. The results show that our approach achieves excellent performance in both image translation and nuclei state inference.
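The abstract does not specify the internals of XAcGAN, but the core idea it names, cross-attention between two feature sets so that translation can exploit spatial correspondence, can be illustrated with a minimal sketch. The sketch below is a generic scaled dot-product cross-attention over flattened patch features, not the authors' implementation; all names (`cross_attention`, the projection matrices, the feature shapes) are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(query_feats, context_feats, d_k=32, seed=0):
    """Generic cross-attention sketch (NOT the XAcGAN architecture).

    query_feats:   (N_q, d) features of bright-field patches (the queries)
    context_feats: (N_c, d) conditioning features the queries attend over
    Returns the attended values (N_q, d_k) and the attention map (N_q, N_c).
    """
    rng = np.random.default_rng(seed)  # random projections stand in for learned weights
    d = query_feats.shape[1]
    W_q = rng.standard_normal((d, d_k)) / np.sqrt(d)
    W_k = rng.standard_normal((d, d_k)) / np.sqrt(d)
    W_v = rng.standard_normal((d, d_k)) / np.sqrt(d)
    Q = query_feats @ W_q
    K = context_feats @ W_k
    V = context_feats @ W_v
    # Scaled dot-product attention: each query patch mixes context patches.
    attn = softmax(Q @ K.T / np.sqrt(d_k), axis=-1)
    return attn @ V, attn

# Toy usage: 64 patches with 128-dim features on each side.
bf = np.random.default_rng(1).standard_normal((64, 128))
cond = np.random.default_rng(2).standard_normal((64, 128))
out, attn = cross_attention(bf, cond)
print(out.shape, attn.shape)
```

In a conditional GAN setting, such an attention map lets the generator weight conditioning features per spatial location instead of concatenating them uniformly, which is one plausible reading of how attention "explores spatial information during translation" in the abstract.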
