The core functions are implemented in the Caffe framework. We use the MATLAB interface (matcaffe) for data preparation.
- Clone our repository and the submodule: simply copy and execute the following commands in the command line.

  ```
  git clone git@github.com:XinshaoAmosWang/Ranked-List-Loss-for-Deep-Metric-Learning.git
  cd Ranked-List-Loss-for-Deep-Metric-Learning/
  git submodule add git@github.com:sciencefans/CaffeMex_v2.git
  git submodule init
  git submodule update
  git submodule update --remote --merge
  ```
- Put the files of the new layers into the corresponding directories of the submodule CaffeMex_v2.

  ```
  cp New_Layers_by_XinshaoAmosWang/*.cpp CaffeMex_v2/src/caffe/layers/
  cp New_Layers_by_XinshaoAmosWang/*.hpp CaffeMex_v2/include/caffe/layers/
  cp New_Layers_by_XinshaoAmosWang/caffe.proto CaffeMex_v2/src/caffe/proto/
  cp New_Layers_by_XinshaoAmosWang/Makefile.config CaffeMex_v2/
  ```
- Install dependencies on Ubuntu 16.04

  ```
  sudo apt-get install libprotobuf-dev libleveldb-dev libsnappy-dev libopencv-dev libhdf5-serial-dev protobuf-compiler
  sudo apt-get install --no-install-recommends libboost-all-dev
  sudo apt-get install libopenblas-dev
  sudo apt-get install python-dev
  sudo apt-get install libgflags-dev libgoogle-glog-dev liblmdb-dev
  ```
- Install MATLAB 2017b

  Download MATLAB 2017b and run the install binary file:

  ```
  ./install
  ```
- Compile Caffe and the MATLAB interface

  Note: you may need to change some paths in Makefile.config according to your system environment and MATLAB path.
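  For example, a minimal illustration assuming MATLAB R2017b is installed under /usr/local/MATLAB/R2017b and OpenBLAS is used (both values are assumptions; adapt them to your own machine):

  ```
  # Illustrative values only -- adjust to your own environment
  BLAS := open
  MATLAB_DIR := /usr/local/MATLAB/R2017b
  ```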
  ```
  cd CaffeMex_v2
  make -j8 && make matcaffe
  ```
Examples for reproducing our results on the Stanford Online Products dataset are given below.
- Data preparation for SOP

  Download the Stanford_Online_Products dataset from ftp://cs.stanford.edu/cs/cvgl/Stanford_Online_Products.zip.

  For simplicity, you can use the data mat files in the pre_post_process directory, which are ready for the training and testing scripts. To resolve the data path, you can do either a or b:

  a. Change the paths inside the mat files.

  b. A simpler way: create a soft link to your data, e.g.

  ```
  sudo ln -s /.../Stanford_Online_Products /home/xinshao/Papers_Projects/Data/Stanford_Online_Products
  ```
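  After setting the data path, a quick sanity check in MATLAB can confirm that the stored image paths resolve. This is an illustrative snippet: the variable names follow this README, but the mat file location and the exact layout of each cell entry are assumptions.

  ```matlab
  % Illustrative sanity check -- adjust the mat file path to where you keep it.
  load('pre_post_process/SOP_TrainImagePathBoxCell.mat');   % TrainImagePathBoxCell, class_ids
  first_entry = TrainImagePathBoxCell{1};
  if iscell(first_entry), first_entry = first_entry{1}; end % in case a box is stored with the path
  assert(exist(first_entry, 'file') == 2, 'Image path does not resolve: %s', first_entry);
  ```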
- Custom data preparation

  You only need to create training/testing mat files with the same structure as SOP_TrainImagePathBoxCell.mat and SOP_TestImagePathBoxCell.mat in the directory SOP_GoogLeNet_Ori_V05/pre_pro_process.

  For example, SOP_TrainImagePathBoxCell.mat contains TrainImagePathBoxCell, storing all image paths, and class_ids, storing their corresponding semantic labels.
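  A minimal MATLAB sketch of building such a training mat file is shown below. The two variable names come from this README; the placeholder image paths and the question of whether a bounding box is stored alongside each path are assumptions, so mirror the provided mat files for the exact layout.

  ```matlab
  % Illustrative sketch of creating a custom training mat file.
  image_list = {'class_a/image_0001.JPG', ...   % placeholder relative paths
                'class_b/image_0002.JPG'};
  labels     = [1, 2];                          % semantic class ids, one per image

  TrainImagePathBoxCell = cell(numel(image_list), 1);
  class_ids             = zeros(numel(image_list), 1);
  for i = 1 : numel(image_list)
      TrainImagePathBoxCell{i} = image_list{i};  % or {path, [x y w h]} if boxes are stored
      class_ids(i)             = labels(i);
  end

  save('SOP_TrainImagePathBoxCell.mat', 'TrainImagePathBoxCell', 'class_ids');
  ```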
- Train & Test

  Run the training and testing scripts in the training folder of a specific setting, which is defined by its corresponding prototxt folder.

  You can use the test scripts to evaluate the performance of our trained model in the directory Our_trained_models_on_SOP_T10_m12_pn04_iter_16000.
If you find our code and paper useful in your research, please kindly cite our paper:
```
@InProceedings{Wang_2019_CVPR,
  author = {Wang, Xinshao and Hua, Yang and Kodirov, Elyor and Hu, Guosheng and Garnier, Romain and Robertson, Neil M.},
  title = {Ranked List Loss for Deep Metric Learning},
  booktitle = {The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  month = {June},
  year = {2019}
}
```
The overall objective is to make the positive set rank before the negative set by a distance margin; the exact order of examples within the positive set and within the negative set does not need to be considered. The key components are listed below, followed by an illustrative sketch.
- Sample mining;
- Sample weighting;
- Two distance hyper-parameters for optimisation and regularisation jointly;
- Exploiting a weighted combination of more data points.
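The following MATLAB sketch illustrates these components for a single query. It is a simplified illustration under assumed hyper-parameter names (a negative boundary alpha, a margin m, and a weighting temperature T), not the exact loss implemented in the provided Caffe layers.

```matlab
function loss = ranked_list_loss_sketch(d_pos, d_neg, alpha, m, T)
% Illustrative sketch (not the exact Caffe layer implementation).
% d_pos: distances from the query to same-class examples
% d_neg: distances from the query to different-class examples
% alpha: boundary that negatives should be pushed beyond
% m:     margin, i.e. positives should be pulled within (alpha - m)
% T:     temperature controlling how strongly harder negatives are weighted

% Sample mining: keep only non-trivial examples that violate the margins.
viol_pos = d_pos(d_pos > alpha - m);
viol_neg = d_neg(d_neg < alpha);

% Positive part: pull violating positives inside the boundary (alpha - m).
loss_pos = sum(viol_pos - (alpha - m));

% Negative part: push violating negatives beyond alpha, using a weighted
% combination so that harder (closer) negatives contribute more.
if isempty(viol_neg)
    loss_neg = 0;
else
    w = exp(T * (alpha - viol_neg));
    loss_neg = sum((w ./ sum(w)) .* (alpha - viol_neg));
end

loss = loss_pos + loss_neg;
end
```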
Our work benefits from:
- Hyun Oh Song, Yu Xiang, Stefanie Jegelka and Silvio Savarese. Deep Metric Learning via Lifted Structured Feature Embedding. In IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016. http://cvgl.stanford.edu/projects/lifted_struct/
- CaffeMex_v2 library: https://github.com/sciencefans/CaffeMex_v2/tree/9bab8d2aaa2dbc448fd7123c98d225c680b066e4
- Caffe library: https://caffe.berkeleyvision.org/
BSD 3-Clause "New" or "Revised" License
Affiliations:
- Queen's University Belfast, UK
- Anyvision Research Team, UK
Xinshao Wang (You can call me Amos as well) xwang39 at qub.ac.uk