Multiscale feature tuned trans-DeepLabV3+ based semantic segmentation of aerial images using improved red piranha optimization algorithm

Abstract

The use of high-resolution aerial images for semantic segmentation in everyday tasks has increased with recent advances in remote sensing and related applications. However, supervised learning requires a large quantity of images with pixel-level labels, and currently available techniques, mostly Deep Semantic Segmentation Networks (DSSN), may not suit application domains where labeled target masks are scarce. Semantic segmentation of high-quality aerial images requires the extraction of multi-scale semantic details. Many techniques have been proposed in recent years to increase networks' capacity to capture multi-scale details in a variety of ways, yet these techniques consistently exhibit poor speed and accuracy when dealing with aerial images. In this work, an effective image semantic segmentation method utilizing deep learning is designed with the aid of a heuristic technique. Aerial photos are collected from standard information sources. The Multi-Scale Retinex (MSRN) technique is employed to enhance the color quality of the obtained images. The enhanced image is then fed as input to the Multiscale Feature Tuned Trans-DeepLabV3+ (MSTDeepLabV3+) network for feature extraction. The Improved Red Piranha Optimization (IRPO) approach is deployed to fine-tune the MSTDeepLabV3+ parameters, and the MSTDeepLabV3+ produces the final semantically segmented aerial images. An experimental setup is carried out to assess how well the implemented model performs, and the simulation outcomes demonstrate its excellent performance.
