Fine Grained Visual Attention

This is a PyTorch implementation of Where to Focus: Deep Attention-based Spatially Recurrent Bilinear Networks for Fine-Grained Visual Recognition by Lin Wu and Yang Wang.

Todo

  • Implement the attention loss penalty (one plausible form is sketched after this list)
  • Implement multi-dimensional hidden-state support using linear layers
  • Compute the expectation of the bilinear pooled feature vector
  • Reproduce the paper's results
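
The repository does not yet say which attention loss penalty it intends to use, so the sketch below is only one plausible candidate: the doubly stochastic attention regulariser from Xu et al. (2015), which is a common choice for soft-attention models. The function name and tensor layout are assumptions, not this repo's API.

```python
import torch

def attention_penalty(alphas: torch.Tensor, weight: float = 1.0) -> torch.Tensor:
    """Doubly stochastic attention regulariser (Xu et al., 2015).

    Encourages the attention weights at each spatial location to sum
    to roughly 1 across time steps, so the recurrent model eventually
    attends to every region instead of fixating on a few.

    alphas: (T, B, L) attention weights over L locations for T steps.
    Hypothetical sketch; the paper's exact penalty may differ.
    """
    per_location_mass = alphas.sum(dim=0)             # (B, L)
    return weight * ((1.0 - per_location_mass) ** 2).sum(dim=1).mean()
```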

Abstract

In this paper, we present a novel attention-based model to automatically, selectively and accurately focus on critical object regions with higher importance against appearance variations. Given an image, two different Convolutional Neural Networks (CNNs) are constructed, where the outputs of the two CNNs are correlated through bilinear pooling to simultaneously focus on discriminative regions and extract relevant features. To capture spatial distributions among the local regions with visual attention, soft attention based spatial Long Short-Term Memory units (LSTMs) are incorporated to realize spatially recurrent yet visually selective processing of local input patterns. The two-stream CNN layers, the bilinear pooling layer, and the spatial recurrent layer with location attention are jointly trained in an end-to-end fashion to serve as the part detector and feature extractor, whereby relevant features are localized and extracted attentively.
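
As a rough illustration of the bilinear pooling step described above, here is a minimal PyTorch sketch. The backbone choice (VGG-16 conv features) and the signed square root plus L2 normalisation are assumptions borrowed from the bilinear-CNN literature, not necessarily what this repository implements.

```python
import torch
import torch.nn as nn
import torchvision.models as models

class BilinearPooling(nn.Module):
    """Correlate two CNN streams via bilinear (outer-product) pooling.

    Hypothetical sketch: the repo's actual backbones and layers may differ.
    """
    def __init__(self):
        super().__init__()
        # Two independent streams; VGG-16 conv features are a common
        # choice in bilinear-CNN papers (an assumption here).
        self.stream_a = models.vgg16(pretrained=True).features
        self.stream_b = models.vgg16(pretrained=True).features

    def forward(self, x):
        fa = self.stream_a(x)                     # (B, C, H, W)
        fb = self.stream_b(x)                     # (B, C, H, W)
        B, C, H, W = fa.shape
        fa = fa.view(B, C, H * W)
        fb = fb.view(B, C, H * W)
        # Sum of outer products over all spatial locations -> (B, C, C)
        bilinear = torch.bmm(fa, fb.transpose(1, 2)) / (H * W)
        bilinear = bilinear.view(B, C * C)
        # Signed square root, then L2 normalisation
        bilinear = torch.sign(bilinear) * torch.sqrt(torch.abs(bilinear) + 1e-10)
        return nn.functional.normalize(bilinear)
```

In the paper's full model the pooled features are not used directly but feed the attention-weighted spatial LSTM, so this module would be only the middle stage of the pipeline.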

Results

Pending: see the "Reproduce the paper's results" item in the Todo list above.

Requirements

  • Python 3.5+
  • PyTorch 0.3+
  • OpenCV
  • NumPy
  • pandas

Usage

The easiest way to start training the model is to edit the parameters in config.py and run the following command:

python train.py
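
The actual keys in config.py are specific to this repository; the snippet below is purely a hypothetical illustration of the kind of parameters you might edit there (every name and default is an assumption, not the repo's real option set).

```python
# config.py -- hypothetical example; the repository's real parameter
# names and defaults may differ.
data_dir = "./data/cub200"        # path to the fine-grained dataset
batch_size = 32
epochs = 100
learning_rate = 1e-4
attention_penalty_weight = 1.0    # weight of the (todo) attention loss penalty
```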

References

  • Lin Wu, Yang Wang. Where to Focus: Deep Attention-based Spatially Recurrent Bilinear Networks for Fine-Grained Visual Recognition.
  • Kelvin Xu et al. Show, Attend and Tell: Neural Image Caption Generation with Visual Attention. ICML 2015.