Hybrid of VGG-16 and FTVT-b16 Models to Enhance Brain Tumors Classification Using MRI Images


Abstract

Background: The accurate classification of brain tumors from magnetic resonance imaging (MRI) scans is pivotal for timely clinical intervention, yet it remains challenged by tumor heterogeneity, morphological variability, and imaging artifacts. Methods: This paper proposes a novel hybrid deep learning framework that combines the hierarchical feature extraction of VGG-16, a convolutional neural network (CNN), with the global contextual modeling of FTVT-b16, a fine-tuned vision transformer (ViT), to improve the precision of brain tumor classification. To evaluate the proposed method's efficacy, two widely used MRI datasets were employed. The first dataset consists of 7,023 MRI scans in four classes: gliomas, meningiomas, pituitary tumors, and no tumor. The second, obtained from Kaggle, consists of 3,000 scans in two classes: healthy brains and brain tumors. Results: On these two datasets, the proposed framework achieved accuracies of 99.46% and 99.90%, respectively. The framework addresses critical limitations of conventional CNNs (local receptive fields) and pure ViTs (data inefficiency), offering a robust, interpretable solution aligned with clinical workflows. These findings underscore the potential of hybrid architectures in neuro-oncology, paving the way for AI-assisted precision diagnostics. Conclusions: Future work will focus on multi-institutional validation and computational optimization to ensure scalability in diverse clinical settings.
