Squeeze-and-Excitation Networks
https://arxiv.org/pdf/1709.01507.pdf
Channel-wise attention: global average pooling squeezes each channel to a single descriptor, then a two-layer FC bottleneck (C → C/r with ReLU, then C/r → C with sigmoid; reduction ratio r = 16 by default) produces per-channel weights. These weights multiply the original feature map channel-wise.
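A minimal NumPy sketch of the block described above, with hypothetical weights and a toy reduction ratio r = 4 (the paper's default is r = 16):

```python
import numpy as np

def se_block(x, w1, b1, w2, b2):
    """Squeeze-and-Excitation on a feature map x of shape (C, H, W).

    w1: (C//r, C) and w2: (C, C//r) are the two FC layers of the
    bottleneck (r = reduction ratio); biases b1, b2 match their outputs.
    """
    # Squeeze: global average pooling -> one descriptor per channel, shape (C,)
    z = x.mean(axis=(1, 2))
    # Excitation: FC -> ReLU -> FC -> sigmoid, per-channel weights in (0, 1)
    s = np.maximum(w1 @ z + b1, 0.0)            # (C//r,)
    s = 1.0 / (1.0 + np.exp(-(w2 @ s + b2)))    # (C,)
    # Scale: reweight each channel of the original feature map
    return x * s[:, None, None]

# Tiny demo: C = 8 channels, reduction ratio r = 4, 5x5 spatial map
rng = np.random.default_rng(0)
C, r, H, W = 8, 4, 5, 5
x = rng.standard_normal((C, H, W))
w1, b1 = rng.standard_normal((C // r, C)), np.zeros(C // r)
w2, b2 = rng.standard_normal((C, C // r)), np.zeros(C)
y = se_block(x, w1, b1, w2, b2)
print(y.shape)  # (8, 5, 5): same shape as the input, channels rescaled
```

In a real network the block sits after a convolution stage and the FC weights are learned; here they are random just to show the data flow.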
Consistent gains when added to multiple backbone networks at small extra cost. Winner of the ILSVRC 2017 image classification task.