
[ECCV-2024] Transferable Targeted Adversarial Attack, CLIP models, Generative adversarial network, Multi-target attacks


CLIP-Guided Generative Networks for Transferable Targeted Adversarial Attacks

The official PyTorch implementation of *CLIP-Guided Generative Networks for Transferable Targeted Adversarial Attacks*, accepted to ECCV 2024.

Hao Fang*, Jiawei Kong*, Bin Chen#, Tao Dai, Hao Wu, Shu-Tao Xia

Figure: overall pipeline of CGNC.

Results

Setup

We provide an environment configuration file exported by Anaconda, which makes setup straightforward:

conda env create -f environment.yml
conda activate CGNC

Running commands

Training

  • Download the ImageNet training set.

  • Below we provide the command for training the CLIP-guided generator conditioned on 8 different target classes, following the same setting as prior work.

python train.py --train_dir $DATA_PATH/ImageNet/train --model_type incv3 --start_epoch 0 --epochs 10 --label_flag 'N8'
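Like most generative targeted attacks, training pushes a surrogate classifier's prediction on the perturbed image toward the chosen target label. A minimal numpy sketch of such a targeted cross-entropy objective (the exact losses used by CGNC, e.g. any CLIP-feature terms, may differ):

```python
import numpy as np

def targeted_ce(logits, target):
    """Cross-entropy between classifier logits and the attack target class.
    Minimizing this drives the prediction toward `target`."""
    z = logits - logits.max()                 # shift for numerical stability
    log_probs = z - np.log(np.exp(z).sum())   # log-softmax
    return -log_probs[target]

# Toy example: logits over 5 classes, target class 2
logits = np.array([1.0, 0.5, 3.0, -1.0, 0.0])
loss = targeted_ce(logits, 2)
```

The loss shrinks as the target logit dominates, which is exactly the signal a conditional generator can be trained against.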

Download pretrained generators CGNC-Incv3 and CGNC-Res152 based on the setting of 8 different classes.

Single-target Masked Finetuning

Below we provide the command for finetuning the CLIP-guided generator on a single target class if needed (taking class id 150 as an example):

python train.py --train_dir $DATA_PATH/ImageNet/train --model_type incv3 --start_epoch 10 --epochs 15 --label_flag 'N8' --load_path $CKPT_DIR/incv3/model-9.pth --finetune --finetune_class 150

Generating adversarial examples

Below we provide the command for generating targeted adversarial examples on the ImageNet NeurIPS validation set (1,000 images) under our multi-class setting:

python eval.py --data_dir data/ImageNet1k/ --model_type incv3 --load_path $SAVE_CHECKPOINT --save_dir $IMAGES_DIR

Below we provide the command for generating targeted adversarial examples on the same validation set under our single-class setting (taking class id 150 as an example):

python eval.py --data_dir data/ImageNet1k/ --model_type incv3 --load_path $SAVE_CHECKPOINT --save_dir $IMAGES_DIR --finetune --finetune_class 150
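Attacks of this kind typically constrain the perturbation to an L∞ budget so the adversarial image stays visually close to the original (ε = 16/255 is a common choice in transferable targeted attacks; the value used by this repo is an assumption). A minimal numpy sketch of projecting a raw generator output back into the valid ball:

```python
import numpy as np

def project_linf(x, x_adv, eps=16 / 255):
    """Clip x_adv so |x_adv - x| <= eps per pixel and values stay in [0, 1]."""
    x_adv = np.clip(x_adv, x - eps, x + eps)   # enforce the L-infinity budget
    return np.clip(x_adv, 0.0, 1.0)            # enforce the valid image range

# Toy example: a clean "image" and an unconstrained adversarial candidate
x = np.full((3, 4, 4), 0.5)
x_adv = project_linf(x, x + np.random.uniform(-0.2, 0.2, size=x.shape))
```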

Testing

The crafted targeted adversarial examples can then be used directly to test different models from torchvision.

Below we provide running commands for testing our method against different black-box models:

python inference.py --test_dir $IMAGES_DIR --model_t vgg16
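The number usually reported in this setting is the targeted attack success rate: the fraction of adversarial examples the black-box model classifies as the chosen target. A minimal numpy sketch of that metric (how `inference.py` actually aggregates results is an assumption):

```python
import numpy as np

def targeted_success_rate(logits, target_class):
    """Fraction of samples whose top-1 prediction equals the target class."""
    preds = logits.argmax(axis=1)
    return float((preds == target_class).mean())

# Toy example: 4 samples, 3 classes, target class 1
logits = np.array([[0.1, 2.0, 0.3],
                   [1.5, 0.2, 0.1],
                   [0.0, 3.0, 0.2],
                   [0.2, 0.1, 4.0]])
rate = targeted_success_rate(logits, 1)  # → 0.5
```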

Cite

@article{fang2024clip,
  title={CLIP-Guided Networks for Transferable Targeted Attacks},
  author={Fang, Hao and Kong, Jiawei and Chen, Bin and Dai, Tao and Wu, Hao and Xia, Shu-Tao},
  journal={arXiv preprint arXiv:2407.10179},
  year={2024}
}

