A fully automated knee magnetic resonance imaging (MRI) segmentation method for studying osteoarthritis (OA) was developed using a novel hierarchical set of random forest (RF) classifiers to learn the appearance of cartilage regions and their boundaries. A neighborhood approximation forest is used first to provide contextual features to a second-level RF classifier that also considers local features and produces location-specific costs for the layered optimal graph image segmentation of multiple objects and surfaces (LOGISMOS) framework. The double-echo steady-state (DESS) MRIs used in this paper originated from the Osteoarthritis Initiative (OAI) study. Trained on 34 MRIs with varying degrees of OA, the learning-based method, tested on 108 MRIs, showed a significant reduction in segmentation errors compared with conventional gradient-based and single-stage RF-learned costs. The 3-D LOGISMOS was extended to longitudinal-3-D (4-D) to simultaneously segment multiple follow-up visits of the same patient, so that data from all time points of the temporal sequence contribute to a single optimal solution that exploits both spatial 3-D and temporal contexts. Validation of 4-D LOGISMOS on 108 MRIs from the baseline and 12-month follow-up scans of 54 patients showed a significant reduction in segmentation errors compared with 3-D. Finally, the potential of 4-D LOGISMOS was further explored on the same 54 patients using five annual follow-up scans, demonstrating significantly improved cartilage-thickness measurement compared with the sequential 3-D approach.
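To make the two-stage cost design concrete, here is a minimal, hypothetical Python sketch of the pattern the abstract describes: a first-stage classifier supplies a contextual probability feature that is appended to local appearance features for a second-stage random forest, whose per-voxel boundary probabilities are inverted into location-specific costs for a graph search such as LOGISMOS. scikit-learn's `RandomForestClassifier` stands in for the paper's neighborhood approximation forest, and the features and labels are synthetic; this illustrates the general hierarchical-classifier idea, not the authors' implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-ins for per-voxel local appearance features
# (e.g., intensities, gradients) and boundary/non-boundary labels.
n_voxels, n_local_features = 4000, 8
X_local = rng.normal(size=(n_voxels, n_local_features))
y = (X_local[:, 0] + 0.5 * X_local[:, 1] > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X_local, y, test_size=0.5, random_state=0)

# Stage 1: a contextual classifier (stand-in for the neighborhood
# approximation forest) yields a per-voxel context probability.
stage1 = RandomForestClassifier(n_estimators=50, random_state=0)
stage1.fit(X_train, y_train)
ctx_train = stage1.predict_proba(X_train)[:, [1]]
ctx_test = stage1.predict_proba(X_test)[:, [1]]

# Stage 2: an RF trained on local + contextual features produces
# the boundary probabilities used to build graph-search costs.
stage2 = RandomForestClassifier(n_estimators=100, random_state=0)
stage2.fit(np.hstack([X_train, ctx_train]), y_train)
p_boundary = stage2.predict_proba(np.hstack([X_test, ctx_test]))[:, 1]

# Location-specific on-surface costs: cheap where the classifier is
# confident a boundary lies (a common inverted-probability form).
costs = 1.0 - p_boundary
print(costs[:5])
```

In the paper's actual pipeline, such costs are assigned to graph columns and the multi-surface, multi-object segmentation is solved globally by LOGISMOS; in the 4-D variant, columns from all time points are linked so that a single optimal solution covers the whole longitudinal sequence.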
Learning-Based Cost Functions for 3-D and 4-D Multi-Surface Multi-Object Segmentation of Knee MRI: Data From the Osteoarthritis Initiative.
Authors: Satyananda Kashyap, Honghai Zhang, Karan Rao, Milan Sonka
| Journal | IEEE Transactions on Medical Imaging |
| --- | --- |
| Impact factor | 9.800 |
| Published | 2018 May;37(5):1103-1113 |
| DOI | 10.1109/TMI.2017.2781541 |
