[ICCV 2021] Demo for the bias loss and SkipblockNet architecture presented in the paper "Bias Loss for Mobile Neural Networks".
To install the required packages, run:
pip install -r requirements.txt
The pretrained SkipblockNet-m is available from Google Drive. For testing, please download the model and place it in the same directory as the validation script, then run:
python validate.py --data path/to/the/dataset
Training and testing code is available for DenseNet121, ShuffleNet V2 0.5x, and ResNet18. To test the pretrained models, please download the corresponding model from Google Drive and run the testing script in the bias loss directory:
python test.py --checkpoint 'path to the checkpoint' --model 'name of the model' --data_path 'path to the cifar-100 dataset'
To train the models, run the training script in the bias loss directory as follows:
python train.py --model 'name of the model to be trained' --data_path 'path to the cifar-100 dataset'
"Bias Loss for Mobile Neural Networks"
By Lusine Abrahamyan, Valentin Ziatchin, Yiming Chen and Nikos Deligiannis.
SkipblockNet outperforms other state-of-the-art lightweight CNNs such as MobileNetV3 and FBNet.
The bias loss is a dynamically scaled cross-entropy loss, where the scale decays as the variance of the data point decreases.
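Below is a minimal PyTorch sketch of this idea, not the paper's exact formulation: the per-sample cross-entropy is multiplied by a scaling term that grows with the (normalized) variance of that sample's feature activations. The function name, the `exp(alpha * v) - beta` scaling form, the default values of `alpha` and `beta`, and the variance normalization are illustrative assumptions; see the paper and the code in this repository for the exact definition.

```python
import torch
import torch.nn.functional as F

def bias_loss_sketch(logits, targets, features, alpha=0.3, beta=0.3):
    """Illustrative dynamically scaled cross-entropy (not the exact paper formula).

    logits:   (N, num_classes) classifier outputs
    targets:  (N,) ground-truth class labels
    features: (N, C, H, W) feature maps used to measure per-sample variance
    """
    # Per-sample variance of the feature activations, normalized to [0, 1]
    v = features.flatten(1).var(dim=1)
    v = (v - v.min()) / (v.max() - v.min() + 1e-8)

    # Scale grows with variance, so low-variance (less informative) samples
    # contribute less to the gradient. alpha/beta are illustrative defaults.
    scale = torch.exp(v * alpha) - beta

    ce = F.cross_entropy(logits, targets, reduction="none")  # per-sample CE
    return (scale * ce).mean()
```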
Below are the results of the pretrained models, which can be found in Google Drive:
| Model | Top-1 accuracy (bias loss) | Top-1 accuracy (CE) |
|---|---|---|
| ResNet18 | 75.51% | 74.33% |
| DenseNet121 | 77.83% | 75.98% |
| ShuffleNet V2 0.5x | 72.00% | 71.55% |
If you find the code useful for your research, please consider citing our work:
@inproceedings{abrahamyan2021bias,
title={Bias Loss for Mobile Neural Networks},
author={Abrahamyan, Lusine and Ziatchin, Valentin and Chen, Yiming and Deligiannis, Nikos},
booktitle={Proceedings of the IEEE/CVF International Conference on Computer Vision},
pages={6556--6566},
year={2021}
}
The code is heavily modified from pytorch-vision and pytorch-cifar100.