Neural architecture search via progressive partial connection with attention mechanism


Abstract

Differentiable architecture search incurs heavy computational cost during the search phase, and deeper searched architectures suffer from the depth gap problem. In this paper, we propose an attention-based progressive partially connected neural architecture search method (PPCAtt-NAS) to address these two issues. First, we introduce a progressive search strategy that gradually increases the sophistication of the architecture and performs path-level pruning in stages to bridge the depth gap. Second, we adopt a partial search scheme that performs channel-level partial sampling of the network architecture, further reducing the computational complexity of the search. In addition, an attention mechanism is devised to improve the architecture search capability by strengthening the relevance between feature channels. Finally, we conduct extensive comparison experiments with state-of-the-art methods on several public datasets, and our method achieves higher architecture performance.
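To make the channel-level partial sampling and channel attention ideas concrete, the following is a minimal NumPy sketch (not the paper's implementation): a fraction 1/k of the channels is sampled and transformed while the remaining channels pass through unchanged, and a simplified squeeze-and-excitation-style gate stands in for the attention mechanism. The function names, the choice of sigmoid gating without learned parameters, and the value of k are illustrative assumptions.

```python
import numpy as np

def channel_attention(x):
    """Simplified channel attention (hypothetical stand-in): reweight each
    channel by a sigmoid of its global average activation."""
    # x: (channels, height, width)
    squeeze = x.mean(axis=(1, 2))             # global average pool per channel
    weights = 1.0 / (1.0 + np.exp(-squeeze))  # sigmoid gate (no learned layers here)
    return x * weights[:, None, None]

def partial_connection(x, k=4, seed=0):
    """Channel-level partial sampling: transform 1/k of the channels,
    pass the rest through unchanged, then recombine."""
    c = x.shape[0]
    rng = np.random.default_rng(seed)
    idx = rng.permutation(c)
    sampled, passthrough = idx[: c // k], idx[c // k :]
    processed = channel_attention(x[sampled])  # candidate ops would act here
    out = np.empty_like(x)
    out[sampled] = processed
    out[passthrough] = x[passthrough]
    return out

x = np.random.default_rng(1).standard_normal((8, 4, 4))
y = partial_connection(x, k=4)
print(y.shape)  # (8, 4, 4)
```

Because only c/k channels enter the candidate operations, the memory and compute of each mixed edge shrink roughly by a factor of k, which is the source of the efficiency gain claimed for partial search.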
