Artificial neural networks trained on simulated multispectral data for real-time imaging of skin microcirculatory blood oxygen saturation



Abstract

SIGNIFICANCE: Imaging blood oxygen saturation (SO2) in the skin can be of clinical value when studying ischemic tissue. Emerging multispectral snapshot cameras enable real-time imaging but are limited by slow analysis when using inverse Monte Carlo (MC), the gold standard for analyzing multispectral data. Using artificial neural networks (ANNs) facilitates a significantly faster analysis but requires a large amount of high-quality training data from a wide range of tissue types for a precise estimation of SO2.

AIM: We aim to develop a framework for training ANNs that estimates SO2 in real time from multispectral data with a precision comparable to inverse MC.

APPROACH: ANNs are trained using synthetic data from a model that includes MC simulations of light propagation in tissue and hardware characteristics. The model includes physiologically relevant variations in optical properties, unique sensor characteristics, variations in illumination spectrum, and detector noise. This approach enables a rapid way of generating high-quality training data that covers different tissue types and skin pigmentation.

RESULTS: The ANN implementation analyzes an image in 0.11 s, which is at least 10,000 times faster than inverse MC. The hardware modeling is significantly improved by an in-house calibration of the sensor spectral response. An in vivo example shows that inverse MC and ANN give almost identical SO2 values, with a mean absolute deviation of 1.3%-units.

CONCLUSIONS: ANN can replace inverse MC and enable real-time imaging of microcirculatory SO2 in the skin if detailed and precise modeling of both tissue and hardware is used when generating training data.
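The approach described above, training a neural network on simulated spectra so that inference replaces a slow inverse model, can be illustrated with a minimal sketch. The sketch below is not the authors' pipeline: it replaces the MC light-transport simulation with a toy Beer-Lambert forward model, uses made-up absorption coefficients for eight hypothetical camera bands, and trains a small one-hidden-layer network with plain gradient descent to regress SO2 from noisy synthetic reflectance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical normalized absorption spectra of oxy- and deoxyhemoglobin
# at 8 snapshot-camera bands (illustrative values, not real coefficients).
n_bands = 8
eps_hbo2 = np.linspace(0.2, 1.0, n_bands)
eps_hb = np.linspace(1.0, 0.3, n_bands)

def simulate_spectra(n):
    """Toy forward model standing in for the MC simulation:
    absorption mixes the two chromophores by SO2, scaled by a
    varying blood volume, with additive detector noise."""
    so2 = rng.uniform(0.0, 1.0, size=(n, 1))
    blood = rng.uniform(0.5, 2.0, size=(n, 1))        # blood-volume variation
    mu_a = blood * (so2 * eps_hbo2 + (1 - so2) * eps_hb)
    refl = np.exp(-mu_a)                              # Beer-Lambert reflectance
    refl += rng.normal(0.0, 0.005, refl.shape)        # detector noise
    return refl, so2

# Generate synthetic training data and fit a tiny MLP (8 -> 16 tanh -> 1).
X, y = simulate_spectra(5000)
W1 = rng.normal(0, 0.5, (n_bands, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1));       b2 = np.zeros(1)
lr = 0.05
for _ in range(5000):                                 # full-batch gradient descent
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y                                    # d(MSE/2)/d(pred)
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)                  # backprop through tanh
    gW1 = X.T @ dh / len(X);  gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Held-out evaluation: inference is a single forward pass per spectrum,
# which is why the trained network is so much faster than inverse fitting.
Xt, yt = simulate_spectra(1000)
pred = np.tanh(Xt @ W1 + b1) @ W2 + b2
mae = np.abs(pred - yt).mean() * 100                  # in %-units
print(f"test MAE: {mae:.2f} %-units")
```

In this toy setting the network recovers SO2 to within a few percent despite the blood-volume confounder and noise; the paper's contribution is making the same idea precise by replacing the toy forward model with MC simulations plus calibrated sensor, illumination, and noise models.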
