Attention-Based Batch Normalization for Binary Neural Networks


Abstract

Batch normalization (BN) is crucial for achieving state-of-the-art binary neural networks (BNNs). Unlike full-precision neural networks, BNNs restrict activations to the discrete values {-1, 1}, which calls for a renewed understanding of the role and significance of BN layers in BNNs. Several studies have noted this and attempted to explain it. Inspired by these studies, we introduce the self-attention mechanism into BN and propose a novel Attention-Based Batch Normalization (ABN) for binary neural networks. We also present an ablation study of parameter trade-offs in ABN and an experimental analysis of its effect on BNNs. These analyses show that ABN helps capture image features, provides an additional activation-like function, and increases the imbalance of the activation distribution, all of which improve BNN performance. Furthermore, we conduct image classification experiments on the CIFAR10, CIFAR100, and TinyImageNet datasets using the BinaryNet and ResNet-18 architectures. The results demonstrate that ABN consistently outperforms the baseline BN in image classification accuracy across benchmark datasets and models. In addition, ABN exhibits lower variance on the CIFAR datasets, suggesting that it improves model stability and reliability.
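
The abstract does not spell out ABN's exact formulation, but the core idea of coupling BN with a self-attention mechanism can be sketched. Below is a minimal, hypothetical PyTorch sketch that assumes a squeeze-and-excitation-style channel-attention gate applied after standard batch normalization; the class name `AttentionBN`, the `reduction` parameter, and the gating design are illustrative assumptions, not the paper's verified method.

```python
import torch
import torch.nn as nn


class AttentionBN(nn.Module):
    """Hypothetical sketch: batch norm followed by a channel-attention gate.

    This is one plausible reading of "introducing self-attention into BN";
    the paper's actual ABN formulation may differ.
    """

    def __init__(self, num_channels: int, reduction: int = 4):
        super().__init__()
        self.bn = nn.BatchNorm2d(num_channels)
        # SE-style gate over channels (assumption, not from the paper):
        # squeeze spatial dims, then produce per-channel weights in (0, 1).
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),  # B x C x 1 x 1
            nn.Conv2d(num_channels, num_channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(num_channels // reduction, num_channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y = self.bn(x)
        # Rescale normalized activations channel-wise before binarization.
        return y * self.gate(y)


if __name__ == "__main__":
    layer = AttentionBN(num_channels=64)
    out = layer(torch.randn(8, 64, 32, 32))
    print(out.shape)  # torch.Size([8, 64, 32, 32])
```

Under this reading, the non-linear sigmoid gate would supply the "additional activation-like function" mentioned in the abstract, and the per-channel rescaling could skew the sign distribution seen by the downstream binarizer, consistent with the reported increase in activation-distribution imbalance.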
