Deep quanvolutional neural networks with enhanced trainability and gradient propagation

Abstract

In this paper, we explore methods to enhance the performance of a frequently used variant of Quantum Convolutional Neural Networks, known as Quanvolutional Neural Networks (QuNNs), by introducing trainable quanvolutional layers and addressing the challenges associated with training multi-layered, or deep, QuNNs. Traditional QuNNs mostly rely on static (non-trainable) quanvolutional layers, which limits their feature extraction capabilities. Our approach enables the training of these layers, significantly improving the scalability and learning potential of QuNNs. However, deep multi-layered QuNNs are difficult to optimize with gradient-based methods because gradients flow poorly across the layers of the network. To overcome this, we propose Residual Quanvolutional Neural Networks (ResQuNNs), which leverage residual learning by adding skip connections between quanvolutional layers. These residual blocks enhance gradient flow throughout the network, enabling the effective training of deep QuNNs. Moreover, we provide empirical evidence on the optimal placement of these residual blocks, demonstrating how strategic configurations improve gradient flow and lead to more efficient training. Our findings represent a significant advancement in quantum deep learning, opening new possibilities for both theoretical exploration and practical quantum computing applications.
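
To make the two ideas above concrete, here is a minimal sketch of a trainable quanvolutional layer wrapped in a residual (skip) connection. It assumes a PennyLane simulator backend; the function names (quanv_filter, quanv_layer, residual_quanv_block), the 2x2 patch size, the AngleEmbedding/BasicEntanglerLayers circuit, and the average-pooled skip path are all illustrative choices, not the authors' implementation.

```python
# Sketch of a trainable quanvolutional layer inside a residual block.
# Assumes PennyLane; circuit and pooling choices are illustrative only.
import pennylane as qml
from pennylane import numpy as np  # autograd-aware NumPy, so weights are trainable

n_qubits = 4                       # one qubit per pixel of a 2x2 patch
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def quanv_filter(patch, weights):
    """Encode one 2x2 patch, then apply a trainable entangling circuit."""
    qml.AngleEmbedding(patch, wires=range(n_qubits))           # data encoding
    qml.BasicEntanglerLayers(weights, wires=range(n_qubits))   # trainable part
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

def quanv_layer(image, weights):
    """Slide the quantum filter over the image with stride 2
    (one output channel per qubit measurement)."""
    h, w = image.shape
    out = [[np.stack(quanv_filter(image[i:i + 2, j:j + 2].flatten(), weights))
            for j in range(0, w, 2)]
           for i in range(0, h, 2)]
    return np.stack([np.stack(row) for row in out])   # (h//2, w//2, n_qubits)

def residual_quanv_block(feature_map, weights):
    """Residual block: add the (downsampled) input back onto the
    quanvolutional output, giving gradients a path around the circuit."""
    quanv_out = quanv_layer(feature_map, weights)
    h, w = feature_map.shape
    # Skip path: 2x2 average pooling so spatial sizes match.
    skip = feature_map.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return quanv_out + skip[:, :, None]                # broadcast over channels

# Usage: an 8x8 "image" and two trainable entangler layers.
weights = np.random.uniform(0, 2 * np.pi, size=(2, n_qubits), requires_grad=True)
image = np.random.uniform(0, np.pi, size=(8, 8))
print(residual_quanv_block(image, weights).shape)      # -> (4, 4, 4)
```

The point of the skip path is that the loss gradient has a route around the quantum circuit, which is the mechanism ResQuNNs use to keep stacks of quanvolutional layers trainable; where exactly such blocks should be placed is the question the paper investigates empirically.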
