Optimized UNet framework with a joint loss function for underwater image enhancement.

Authors: Wang Xin, Luo Zhonghua, Huang Wei, Zhang Yizhou, Hu Rongqun
As the water economy advances and the concepts of water ecology protection and sustainable development gain broad acceptance, underwater imaging equipment has made remarkable progress. However, owing to factors such as light absorption and scattering in water, underwater images still suffer from low quality. Enhancing underwater images so that their content can be interpreted quickly and reliably has therefore become a crucial issue. Aiming at degradation problems such as detail blurring, color imbalance, and noise interference in low-quality underwater images, this paper proposes an optimized UNet framework with a joint loss function (OUNet-JL). Firstly, to alleviate detail blurring, we construct a multi-residual module (MRM) that strengthens the representation of detail features through serially stacked convolutional blocks and residual connections. Secondly, we build a spatial multi-scale feature extraction module fused with channel attention (SMFM), which addresses color imbalance through multi-scale dilated convolutions and channel attention. Thirdly, to improve the signal-to-noise ratio of the enhanced image and mitigate blurring distortion, a strengthen-operate-subtract feature reconstruction module (SOSFM) is presented. Fourthly, to guide training more efficiently and accelerate convergence, a joint loss function is designed by integrating four different loss functions. Extensive experiments on the well-known UIEB and UFO-120 datasets demonstrate the superiority of OUNet-JL over several state-of-the-art algorithms, and ablation studies verify the effectiveness of the proposed modules. Our source code is publicly available at https://github.com/WangXin81/OUNet_JL.
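
The abstract gives no implementation details, but the multi-residual module (MRM) it describes can be sketched as serially stacked convolutional blocks wrapped in residual connections. In the minimal PyTorch sketch below, the block count, channel width, and use of batch normalization are illustrative assumptions, not the paper's exact configuration:

```python
import torch
import torch.nn as nn

class MRM(nn.Module):
    """Multi-residual module: serially stacked conv blocks with
    per-block residual connections plus a long-range skip.
    Block count and normalization choice are assumptions."""
    def __init__(self, channels: int, num_blocks: int = 3):
        super().__init__()
        self.blocks = nn.ModuleList(
            nn.Sequential(
                nn.Conv2d(channels, channels, 3, padding=1),
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, channels, 3, padding=1),
                nn.BatchNorm2d(channels),
            )
            for _ in range(num_blocks)
        )
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        out = x
        for block in self.blocks:
            out = self.act(block(out) + out)  # per-block residual
        return out + x  # long-range residual over the whole stack
```

Because `MRM(64)` maps a `(N, 64, H, W)` tensor to the same shape, a module of this form can drop into any stage of a UNet encoder or decoder.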
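Similarly, the SMFM's stated combination of multi-scale dilated convolution and channel attention suggests a structure along the following lines. The dilation rates (1, 2, 4) and the squeeze-and-excitation-style attention are assumptions used only to make the sketch concrete:

```python
import torch
import torch.nn as nn

class SMFM(nn.Module):
    """Spatial multi-scale feature extraction fused with channel
    attention. Dilation rates and SE-style attention are assumed."""
    def __init__(self, channels: int, rates=(1, 2, 4), reduction: int = 8):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Sequential(
                nn.Conv2d(channels, channels, 3, padding=r, dilation=r),
                nn.ReLU(inplace=True),
            )
            for r in rates
        )
        self.fuse = nn.Conv2d(channels * len(rates), channels, 1)
        # Squeeze-and-excitation style channel attention
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        feats = torch.cat([b(x) for b in self.branches], dim=1)
        fused = self.fuse(feats)
        return fused * self.attn(fused) + x  # reweight channels, keep identity path
```

Parallel dilated branches enlarge the receptive field without downsampling, while the attention vector rescales channels globally, which is a common way to correct channel-wise (color) statistics.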
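The name of the strengthen-operate-subtract feature reconstruction module (SOSFM) echoes SOS boosting (Romano and Elad, 2015), in which a signal is strengthened by adding a previous estimate, an operator is applied, and the added estimate is then subtracted. The sketch below follows that reading; the two-layer convolutional operator and the use of the encoder skip as the added estimate are assumptions, not the paper's confirmed design:

```python
import torch
import torch.nn as nn

class SOSFM(nn.Module):
    """Strengthen-operate-subtract reconstruction, sketched in the
    spirit of SOS boosting: out = F(x + skip) - skip. The operator
    F is an assumed two-layer conv block."""
    def __init__(self, channels: int):
        super().__init__()
        self.operate = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )

    def forward(self, x, skip):
        strengthened = x + skip               # strengthen: inject encoder detail
        operated = self.operate(strengthened) # operate: refine jointly
        return operated - skip                # subtract: remove the added estimate
```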
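Finally, the abstract states that the joint loss integrates four loss functions but does not name them. A hypothetical combination common in underwater image enhancement (pixel-wise L1, SSIM, gradient, and cosine color terms, with assumed weights) illustrates how such a joint objective can be assembled; the paper's actual four terms may differ:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def ssim(x, y, c1=0.01 ** 2, c2=0.03 ** 2, win=11):
    """Simplified single-scale SSIM with a uniform window."""
    pad = win // 2
    mu_x = F.avg_pool2d(x, win, 1, pad)
    mu_y = F.avg_pool2d(y, win, 1, pad)
    var_x = F.avg_pool2d(x * x, win, 1, pad) - mu_x ** 2
    var_y = F.avg_pool2d(y * y, win, 1, pad) - mu_y ** 2
    cov = F.avg_pool2d(x * y, win, 1, pad) - mu_x * mu_y
    s = ((2 * mu_x * mu_y + c1) * (2 * cov + c2)) / (
        (mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2))
    return s.mean()

def gradient_loss(x, y):
    """L1 distance between horizontal/vertical image gradients."""
    dx = lambda t: t[..., :, 1:] - t[..., :, :-1]
    dy = lambda t: t[..., 1:, :] - t[..., :-1, :]
    return F.l1_loss(dx(x), dx(y)) + F.l1_loss(dy(x), dy(y))

class JointLoss(nn.Module):
    """Weighted sum of four terms; weights are placeholder values."""
    def __init__(self, w=(1.0, 0.5, 0.25, 0.25)):
        super().__init__()
        self.w = w

    def forward(self, pred, target):
        l_pix = F.l1_loss(pred, target)
        l_ssim = 1.0 - ssim(pred, target)
        l_grad = gradient_loss(pred, target)
        # Cosine color loss penalizes per-pixel hue/cast deviation
        l_color = (1.0 - F.cosine_similarity(pred, target, dim=1)).mean()
        w1, w2, w3, w4 = self.w
        return w1 * l_pix + w2 * l_ssim + w3 * l_grad + w4 * l_color
```

Combining complementary terms in this way lets the pixel loss anchor overall fidelity while the structural, gradient, and color terms steer the network toward sharp edges and balanced color, which matches the convergence motivation stated in the abstract.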
