
BAM: Bottleneck Attention Module

Jongchan Park et al. — BMVC 2018

Recent advances in deep neural networks have been driven by architecture search over depth, width, and cardinality. In this work, we focus on the effect of attention in general deep neural networks. We propose a simple and effective attention module, named Bottleneck Attention Module (BAM), that can be integrated with any feed-forward convolutional neural network. Our module infers an attention map along two separate pathways, channel and spatial. We place the module at each bottleneck of a model, where downsampling of the feature maps occurs. The module thus constructs hierarchical attention at the bottlenecks with only a small number of additional parameters, and it is trainable end-to-end jointly with any feed-forward model. We validate BAM through extensive experiments on the CIFAR-100, ImageNet-1K, VOC 2007, and MS COCO benchmarks. The experiments show consistent improvements in classification and detection performance across various models, demonstrating the wide applicability of BAM. The code and models will be publicly available.
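For intuition, below is a minimal PyTorch sketch of the block the abstract describes: a channel pathway (global average pooling followed by a bottleneck MLP) and a spatial pathway (channel reduction followed by dilated convolutions) are summed, passed through a sigmoid, and applied as a residual gate. It assumes the paper's default hyperparameters (reduction ratio r = 16, dilation d = 4); the authors' released implementation may differ in details such as normalization placement.

import torch
import torch.nn as nn

class BAM(nn.Module):
    """Sketch of a Bottleneck Attention Module: F' = F + F * sigmoid(Mc(F) + Ms(F))."""

    def __init__(self, channels: int, reduction: int = 16, dilation: int = 4):
        super().__init__()
        mid = channels // reduction
        # Channel pathway Mc: global average pool -> bottleneck MLP -> C logits
        self.channel_att = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(channels, mid),
            nn.BatchNorm1d(mid),
            nn.ReLU(inplace=True),
            nn.Linear(mid, channels),
        )
        # Spatial pathway Ms: 1x1 reduction -> two 3x3 dilated convs -> 1-channel map
        self.spatial_att = nn.Sequential(
            nn.Conv2d(channels, mid, kernel_size=1),
            nn.BatchNorm2d(mid),
            nn.ReLU(inplace=True),
            nn.Conv2d(mid, mid, kernel_size=3, padding=dilation, dilation=dilation),
            nn.BatchNorm2d(mid),
            nn.ReLU(inplace=True),
            nn.Conv2d(mid, mid, kernel_size=3, padding=dilation, dilation=dilation),
            nn.BatchNorm2d(mid),
            nn.ReLU(inplace=True),
            nn.Conv2d(mid, 1, kernel_size=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        mc = self.channel_att(x).view(b, c, 1, 1)  # broadcast over H and W
        ms = self.spatial_att(x)                   # broadcast over channels
        att = torch.sigmoid(mc + ms)               # combined 3D attention map
        return x + x * att                         # residual gating of the input

In use, a BAM(channels) instance would sit between the stages of a backbone such as ResNet, just before each point where the feature maps are downsampled, matching the bottleneck placement described in the abstract.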

AUTHORS

Jongchan Park¹, Sanghyun Woo², Joon-Young Lee³ and In So Kweon²

¹Lunit Inc., ²KAIST, ³Adobe Research

PUBLISHED
BMVC 2018
