This is the official PyTorch implementation of AGNAS: Attention-Guided Micro- and Macro-Architecture Search (ICML 2022) by Zihao Sun et al.
- Python 3.7.10
- PyTorch >= 1.1.0
CIFAR-10 is downloaded automatically by this code via torchvision and placed in the folder ./dataset/cifar10.
ImageNet needs to be downloaded manually by following the instructions here.
To evaluate the AGNAS-searched model on CIFAR-10:
Pre-trained checkpoints are released on Google Drive. Download and place them in ./Darts_Search_Space/eval_retrained_model, then run:
cd Darts_Search_Space
python test.py
To search for an AGNAS model on CIFAR-10, run:
cd Darts_Search_Space
python train_search.py
To retrain the AGNAS-searched model on CIFAR-10, run:
cd Darts_Search_Space
python train.py
To evaluate the AGNAS-searched model on ImageNet:
Pre-trained checkpoints are released on Google Drive. Download and place them in ./ProxylessNAS_Search_Space/retrain_architecture/eval_retrained_model, then run:
cd ProxylessNAS_Search_Space/retrain_architecture
python valid.py
To search for an AGNAS model on ImageNet, run:
cd ProxylessNAS_Search_Space/attention_search
bash run_train.sh
To retrain the AGNAS-searched model on ImageNet, run:
cd ProxylessNAS_Search_Space/retrain_architecture
bash run_retrain.sh
Please cite our paper if you find it helpful:
@inproceedings{sun2022agnas,
title={AGNAS: Attention-Guided Micro and Macro-Architecture Search},
author={Sun, Zihao and Hu, Yu and Lu, Shun and Yang, Longxing and Mei, Jilin and Han, Yinhe and Li, Xiaowei},
booktitle={International Conference on Machine Learning},
pages={20777--20789},
year={2022},
organization={PMLR}
}