Hierarchical Multi-Granularity Classification based on Residual Gated GCN
In this paper, we investigate hierarchical multi-granularity classification with training samples labeled at different levels of the class hierarchy. To this end, we propose a combinatorial loss and a hierarchical residual network (HRN) for hierarchical feature interaction. This repo implements an alternative hierarchical feature interaction network (HMGN) trained with the same combinatorial loss. The pipeline is as follows; illustrative sketches of each step are given after the list.
- The trunk net (ResNet-50) produces the feature map;
- A pre-trained GloVe model provides a word vector for each class in the hierarchy;
- The word vectors interact with the feature map through low-rank bilinear pooling to generate semantic-guided attention coefficients;
- We perform weighted average pooling over all locations in the feature map to obtain the initial feature vector for each node in the graph;
- In the graph, each node represents a class in the hierarchy, and child nodes are connected to their parents by undirected edges;
- We adopt the residual gated GCN to perform feature interaction between nodes;
- The node classifier's weight matrix is multiplied element-wise with the matrix formed by the node feature vectors; average pooling over the result for each class yields the final vector used for classification (see the sketches below).
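
A minimal sketch of the trunk net, assuming the standard torchvision ResNet-50 with its average-pooling and fully-connected layers removed; the `TrunkNet` wrapper class is illustrative, not the repo's API:

```python
import torch
import torch.nn as nn
from torchvision.models import resnet50

class TrunkNet(nn.Module):
    def __init__(self, pretrained=True):
        super().__init__()
        backbone = resnet50(pretrained=pretrained)
        # keep everything up to (and including) layer4; drop avgpool and fc
        self.features = nn.Sequential(*list(backbone.children())[:-2])

    def forward(self, x):
        # x: (B, 3, H, W) -> feature map (B, 2048, H/32, W/32)
        return self.features(x)

if __name__ == "__main__":
    fmap = TrunkNet(pretrained=False)(torch.randn(2, 3, 448, 448))
    print(fmap.shape)  # torch.Size([2, 2048, 14, 14])
```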
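A hedged sketch of the semantic-guided attention and weighted average pooling steps, assuming MLB-style low-rank bilinear pooling and pre-extracted GloVe word vectors passed in as a tensor; module names and dimensions are illustrative assumptions, not the repo's API:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SemanticGuidedPooling(nn.Module):
    def __init__(self, feat_dim=2048, word_dim=300, rank=512):
        super().__init__()
        self.proj_feat = nn.Linear(feat_dim, rank, bias=False)   # projects feature map locations
        self.proj_word = nn.Linear(word_dim, rank, bias=False)   # projects class word vectors
        self.proj_out = nn.Linear(rank, 1, bias=False)           # reduces the joint space to a scalar

    def forward(self, fmap, word_vecs):
        # fmap: (B, C, H, W), word_vecs: (N, word_dim) for N hierarchy nodes
        B, C, H, W = fmap.shape
        feats = fmap.flatten(2).transpose(1, 2)                  # (B, HW, C)
        f = torch.tanh(self.proj_feat(feats))                    # (B, HW, rank)
        w = torch.tanh(self.proj_word(word_vecs))                # (N, rank)
        # low-rank bilinear interaction for every (location, class) pair
        joint = f.unsqueeze(2) * w.unsqueeze(0).unsqueeze(0)     # (B, HW, N, rank)
        attn = self.proj_out(joint).squeeze(-1)                  # (B, HW, N)
        attn = F.softmax(attn, dim=1)                            # normalize over spatial locations
        # weighted average pooling -> one initial feature vector per graph node
        node_feats = torch.einsum('blc,bln->bnc', feats, attn)   # (B, N, C)
        return node_feats
```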
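A small sketch of the hierarchy graph built with networkx, where every class at every level is a node and each child is connected to its parent by an undirected edge; the toy taxonomy entries are only an example:

```python
import networkx as nx
import torch

def build_hierarchy_graph(parent_of):
    """parent_of: dict mapping a child class name to its parent class name."""
    g = nx.Graph()
    for child, parent in parent_of.items():
        g.add_edge(parent, child)   # undirected edge between parent and child
    return g

hierarchy = {"Laridae": "Charadriiformes", "Sternula antillarum": "Laridae"}
g = build_hierarchy_graph(hierarchy)
# dense adjacency (plus self-loops) used later by the GCN layers
adj = torch.tensor(nx.to_numpy_array(g), dtype=torch.float32) + torch.eye(g.number_of_nodes())
print(list(g.nodes()), adj.shape)
```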
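A minimal residual gated GCN layer in the spirit of Bresson and Laurent's residual gated graph ConvNets, operating on a dense adjacency matrix; the exact gating and normalization used in this repo may differ, so treat this as a sketch:

```python
import torch
import torch.nn as nn

class ResidualGatedGCNLayer(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.U = nn.Linear(dim, dim)   # transform of the centre node
        self.V = nn.Linear(dim, dim)   # transform of the neighbours
        self.A = nn.Linear(dim, dim)   # gate: centre-node term
        self.B = nn.Linear(dim, dim)   # gate: neighbour term
        self.bn = nn.BatchNorm1d(dim)

    def forward(self, h, adj):
        # h: (B, N, dim) node features, adj: (N, N) adjacency with self-loops
        gate = torch.sigmoid(self.A(h).unsqueeze(2) + self.B(h).unsqueeze(1))  # (B, N, N, dim) edge gates
        msg = gate * self.V(h).unsqueeze(1)                                    # gated neighbour messages
        agg = (adj.unsqueeze(0).unsqueeze(-1) * msg).sum(dim=2)                # aggregate over neighbours
        out = self.U(h) + agg
        out = self.bn(out.transpose(1, 2)).transpose(1, 2)                     # BN over the feature dim
        return h + torch.relu(out)                                             # residual connection
```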
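One possible reading of the node classifier described above: a learnable weight matrix with one row per node is multiplied element-wise with the node feature matrix, and average pooling along the feature dimension yields one score per class. This is a sketch of that reading, not necessarily the repo's exact classification head:

```python
import torch
import torch.nn as nn

class NodeClassifier(nn.Module):
    def __init__(self, num_nodes, dim):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(num_nodes, dim) * 0.01)

    def forward(self, node_feats):
        # node_feats: (B, N, dim) -> scores: (B, N), one logit per hierarchy node
        return (node_feats * self.weight).mean(dim=-1)
```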
- Python 3.7
- PyTorch 1.3.1
- torchvision 0.4.2
- networkx 2.3
- CUDA 10.2
We perform experiments on CUB-200-2011 and compare the two hierarchical networks under the same combinatorial loss. Supporting files can be found in the related repo: HRN.
OA (%) results on CUB-200-2011 with a relabeling proportion of 0%, comparing the two hierarchical networks:
| Levels | HRN | HMGN |
| --- | --- | --- |
| Orders | 98.67 | 98.74 |
| Families | 95.51 | 95.27 |
| Species | 86.60 | 85.27 |