Squeeze-and-Excitation Networks

https://arxiv.org/pdf/1709.01507.pdf

Channel-wise attention: global average pooling, then a fully connected layer reducing to C/r channels (ReLU), then a fully connected layer restoring to C channels (sigmoid), where r is a reduction ratio (16 by default in the paper). The resulting per-channel weights multiply the original feature map channel-wise. A minimal sketch follows below.
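
A minimal PyTorch sketch of the squeeze-and-excitation idea, assuming the paper's default reduction ratio r=16; the class name `SEBlock` is my own, not from the paper's code.

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Sketch of a squeeze-and-excitation block (reduction ratio r assumed 16)."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.fc1 = nn.Linear(channels, channels // reduction)  # squeeze: C -> C/r
        self.fc2 = nn.Linear(channels // reduction, channels)  # excite: C/r -> C

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        s = x.mean(dim=(2, 3))           # global average pooling -> (B, C)
        s = torch.relu(self.fc1(s))      # FC + ReLU
        s = torch.sigmoid(self.fc2(s))   # FC + sigmoid -> per-channel weights in (0, 1)
        return x * s.view(b, c, 1, 1)    # rescale the original feature map channel-wise
```

In practice the block is inserted after each residual or inception unit of an existing backbone, adding only a small parameter and compute overhead.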

Gives consistent improvements when added to a range of backbone networks at small extra cost. SENets won the ILSVRC 2017 image classification challenge.
