Dense skip-attention for convolutional networks

Abstract

The attention mechanism plays a crucial role in enhancing model performance by guiding the model to focus on important features. However, existing attention methods primarily concentrate on learning attention features within individual modules while ignoring interactions among attention features across the network. To overcome this limitation, we propose dense skip-attention for convolutional networks, a simple but effective approach to boosting performance. Our method establishes dense skip-attention connections that interconnect all attention modules, forcing the model to learn interactive attention features throughout the network architecture. We conduct extensive experiments on the ImageNet 2012 and Microsoft COCO (MS COCO) 2017 datasets to validate the effectiveness of our approach. The experimental results demonstrate that our method improves the performance of existing attention methods, such as Squeeze-and-Excitation Networks, Efficient Channel Attention Networks, and the Convolutional Block Attention Module, on tasks such as image classification, object detection, and instance segmentation. Notably, it achieves these improvements with negligible additional model parameters and computational cost.
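As a rough illustration of the idea, the sketch below shows how channel-attention vectors produced by earlier blocks might be carried forward and fused into a later Squeeze-and-Excitation block. The abstract does not specify the fusion rule, so the `DenseSkipSEBlock` class, the summation of previous attention logits, and the shape-matching check are all assumptions made for illustration, not the paper's actual formulation.

```python
# Hypothetical sketch of a dense skip-attention SE block (PyTorch).
# Assumption (not from the paper): skip-attention is realized by summing the
# channel-attention logits produced by all earlier blocks with the current
# block's own logits before reweighting the feature map.
import torch
import torch.nn as nn


class DenseSkipSEBlock(nn.Module):
    """SE-style block that also consumes attention from earlier blocks."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x, prev_attn):
        # Squeeze: global average pooling over the spatial dimensions.
        b, c, _, _ = x.shape
        s = x.mean(dim=(2, 3))            # (B, C)
        a = self.fc(s)                    # current block's attention logits
        # Dense skip-attention: fuse logits from all earlier blocks (assumed sum).
        for p in prev_attn:
            if p.shape == a.shape:        # only fuse blocks with matching width
                a = a + p
        w = torch.sigmoid(a)              # (B, C) channel weights
        # Return reweighted features plus the logits to pass to later blocks.
        return x * w.view(b, c, 1, 1), a


if __name__ == "__main__":
    blocks = nn.ModuleList([DenseSkipSEBlock(64) for _ in range(3)])
    x = torch.randn(2, 64, 32, 32)
    attn_history = []
    for blk in blocks:
        x, a = blk(x, attn_history)
        attn_history.append(a)
    print(x.shape)  # torch.Size([2, 64, 32, 32])
```

In this reading, each block's reweighting depends on the attention produced by every earlier block, which is what lets attention features interact across the whole network rather than staying local to a single module.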
