Artificial confocal microscopy for deep label-free imaging

Authors: Xi Chen, Mikhail E. Kandel, Shenghua He, Chenfei Hu, Young Jae Lee, Kathryn Sullivan, Gregory Tracy, Hee Jung Chung, Hyun Joon Kong, Mark Anastasio, Gabriel Popescu

Abstract

Widefield microscopy of optically thick specimens typically features reduced contrast due to "spatial crosstalk", in which the signal at each point in the field of view is the result of a superposition from neighbouring points that are simultaneously illuminated. In 1955, Marvin Minsky proposed confocal microscopy as a solution to this problem. Today, laser scanning confocal fluorescence microscopy is broadly used due to its high depth resolution and sensitivity, but it comes at the price of photobleaching, chemical toxicity, and phototoxicity. Here, we present artificial confocal microscopy (ACM) to achieve confocal-level depth sectioning, sensitivity, and chemical specificity on unlabeled specimens, nondestructively. We equipped a commercial laser scanning confocal instrument with a quantitative phase imaging module, which provides optical path-length maps of the specimen in the same field of view as the fluorescence channel. Using pairs of phase and fluorescence images, we trained a convolutional neural network to translate the former into the latter. Training to infer a new tag is practical because the input and ground-truth data are intrinsically registered and the data acquisition is automated. The ACM images present significantly stronger depth sectioning than the input (phase) images, enabling us to recover confocal-like tomographic volumes of microspheres, hippocampal neurons in culture, and 3D liver cancer spheroids. By training on nucleus-specific tags, ACM allows for segmenting individual nuclei within dense spheroids for both cell counting and volume measurements. In summary, ACM can provide quantitative, dynamic data nondestructively from thick samples, while chemical specificity is recovered computationally.
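The core computational step described above is supervised image-to-image translation: because the phase and fluorescence channels are acquired on the same instrument, each phase image comes with a pixel-registered fluorescence target, and a network can be fit by regression. The paper uses a deep convolutional network; as a toy illustration only, the sketch below fits a single learnable 3×3 convolution kernel to synthetic registered image pairs by gradient descent on a mean-squared-error loss. All data and the kernel here are synthetic assumptions, not the authors' model or data.

```python
import numpy as np

def conv2d(img, k):
    """Valid-mode 2D cross-correlation of img with a 3x3 kernel k."""
    H, W = img.shape
    out = np.zeros((H - 2, W - 2))
    for i in range(3):
        for j in range(3):
            out += k[i, j] * img[i:H - 2 + i, j:W - 2 + j]
    return out

rng = np.random.default_rng(0)
# Hypothetical ground-truth phase -> fluorescence mapping (a 3x3 filter).
true_k = rng.normal(size=(3, 3))
# Synthetic registered pairs: "phase" inputs and "fluorescence" targets.
phase = [rng.normal(size=(16, 16)) for _ in range(8)]
fluor = [conv2d(p, true_k) for p in phase]

k = np.zeros((3, 3))   # learnable kernel, initialized to zero
lr = 0.05
for epoch in range(200):
    loss = 0.0
    grad = np.zeros((3, 3))
    for p, f in zip(phase, fluor):
        err = conv2d(p, k) - f            # prediction error
        loss += np.mean(err ** 2)         # MSE regression loss
        H, W = p.shape
        for i in range(3):                # analytic gradient of the MSE
            for j in range(3):
                grad[i, j] += 2 * np.mean(err * p[i:H - 2 + i, j:W - 2 + j])
    k -= lr * grad / len(phase)           # gradient-descent update
```

After training, `k` recovers the hypothetical mapping `true_k` to high accuracy; the real ACM network replaces this single linear filter with a deep, nonlinear architecture, but the registered-pair regression setup is the same.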
