This repository contains the training code for Quantization Networks, introduced in the CVPR 2019 paper of the same name.
Our implementation is a modified version of the original. The main changes are:
- Support for the CIFAR-10 and MNIST datasets in addition to ImageNet.
- A unified training script for full-precision training, weight quantization, and activation quantization.
- Training-curve visualization with TensorBoard.
- Automatic clustering before quantization training, so no manual clustering is needed.
- Freezing of BN layers during activation-quantization training (idea borrowed from FQN).
- Removal of outliers in clustering using the 3-sigma rule (also borrowed from FQN); a rough sketch of the last two ideas follows this list.
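As an illustration of the last two items, the snippet below sketches how clustering-based initialization with 3-sigma outlier removal and BN freezing could look in PyTorch. It is a minimal sketch under our own assumptions, not the exact code of this repository; the helper names (`init_quant_levels`, `freeze_bn`) and the use of scikit-learn's `KMeans` are hypothetical.

```python
import numpy as np
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

def init_quant_levels(weight: torch.Tensor, n_levels: int) -> np.ndarray:
    """Cluster weights to initialize quantization levels (hypothetical helper).

    Weights beyond 3 standard deviations from the mean are dropped before
    clustering (the 3-sigma rule), so a few extreme values do not distort
    the cluster centers.
    """
    w = weight.detach().cpu().numpy().ravel()
    mean, std = w.mean(), w.std()
    w = w[np.abs(w - mean) <= 3 * std]            # 3-sigma outlier removal
    centers = KMeans(n_clusters=n_levels).fit(w.reshape(-1, 1)).cluster_centers_
    return np.sort(centers.ravel())               # sorted quantization levels

def freeze_bn(model: nn.Module) -> None:
    """Freeze BatchNorm layers during activation-quantization training."""
    for m in model.modules():
        if isinstance(m, nn.BatchNorm2d):
            m.eval()                              # stop updating running stats
            if m.affine:
                m.weight.requires_grad_(False)    # freeze affine parameters
                m.bias.requires_grad_(False)
```

Note that a later call to `model.train()` puts BN layers back into training mode, so the freezing would have to be re-applied after each such call.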
First, create a compatible Python environment. The configurations we used during development are listed below.
- Ubuntu 18.04
- Python 3.7
- opencv-python
- numpy 1.17.4
- pytorch 1.3.0
- torchvision 0.4.2
- Tensorboard 2.0.0
- argparse 1.1
- logging 0.5.1.2
For data preparation, please refer to PREPARE_DATA.md.
For training and evaluation, please refer to GET_STARTED.md.
For experiments, please refer to EXPERIMENTS.md.
This repository is forked from aliyun/alibabacloud-quantization-networks and keeps its Apache 2.0 license.
Please cite the paper if it helps your research:
@inproceedings{yang2019quantization,
  title={Quantization Networks},
  author={Yang, Jiwei and Shen, Xu and Xing, Jun and Tian, Xinmei and Li, Houqiang and Deng, Bing and Huang, Jianqiang and Hua, Xian-sheng},
  booktitle={CVPR},
  year={2019}
}