Example code implementing the PGD and FGSM adversarial attack algorithms on CIFAR-10 classifiers.
The pretrained models are from here
Please download the pretrained models first and put them in /cifar10_models/state_dicts as instructed in the link above.
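As a quick sanity check after placing the checkpoints, a minimal loading sketch is shown below; the helper name and the architecture/filename in the example comment are illustrative assumptions, not part of this repo.

```python
# Minimal sketch: load a downloaded checkpoint into a model before attacking it.
# The constructor and checkpoint filename in the comment below are assumptions;
# use the model definitions that actually match the downloaded state dicts.
import torch
import torch.nn as nn

def load_pretrained(model: nn.Module, ckpt_path: str) -> nn.Module:
    state_dict = torch.load(ckpt_path, map_location="cpu")
    model.load_state_dict(state_dict)
    model.eval()  # attacks run against a frozen, eval-mode model
    return model

# e.g. model = load_pretrained(vgg16_bn(num_classes=10), "cifar10_models/state_dicts/vgg16_bn.pt")
```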
Please prepare your clean CIFAR-10 examples and put each class in its own folder (see the loading sketch after the tree). That is:
| imgs/
| - frog
| -- frog1.png frog2.png ......
...
| - automobile
...
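A minimal sketch of reading a folder tree like this, assuming torchvision's ImageFolder; the actual preprocessing in main.py may differ.

```python
# Sketch of reading the class-per-folder layout above with torchvision's ImageFolder.
# Normalization is deliberately omitted so the attack sketches further down can clip in
# plain [0, 1] pixel space; apply whatever preprocessing your checkpoints expect.
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Note: class indices follow the sorted folder names, which may not match the
# label order the pretrained checkpoints were trained with.
dataset = datasets.ImageFolder("imgs", transform=transforms.ToTensor())
loader = DataLoader(dataset, batch_size=32, shuffle=False)
```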
$python3 main.py -I input_normal_examples_path -M model -T mode -O adversarial_examples_folder_name
model (-M): vgg16_bn, resnet50, mobilenet_v2, or densenet161
mode (-T): PGD or FGSM, as sketched below
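For reference, minimal sketches of the two attack modes, assuming a cross-entropy loss, inputs in [0, 1], and illustrative eps/alpha/step values; the defaults and details in main.py may differ.

```python
import torch
import torch.nn.functional as F

def fgsm(model, x, y, eps=8/255):
    # Single signed-gradient step, then clip back to the valid image range.
    x = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x), y)
    loss.backward()
    return (x + eps * x.grad.sign()).clamp(0, 1).detach()

def pgd(model, x, y, eps=8/255, alpha=2/255, steps=10):
    x_adv = x.clone().detach()
    for _ in range(steps):
        x_adv.requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), y)
        grad = torch.autograd.grad(loss, x_adv)[0]
        # Gradient-sign step, then projection back into the eps-ball around x.
        x_adv = x_adv.detach() + alpha * grad.sign()
        x_adv = torch.min(torch.max(x_adv, x - eps), x + eps).clamp(0, 1)
    return x_adv
```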
$python3 transferability.py -I input_normal_examples_path -O 1or0
-O: whether to generate a confusion table (1) or not (0)
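The sketch below shows one way such a confusion table could be built for a set of adversarial examples evaluated on one model; it is not the actual transferability.py implementation, and its output format is not reproduced here.

```python
# Confusion table sketch: counts of true class vs. predicted class on adversarial
# examples. `num_classes=10` assumes CIFAR-10; `model` and `loader` are assumed
# to be an eval-mode classifier and a (adversarial image, true label) loader.
import torch

@torch.no_grad()
def confusion_table(model, loader, num_classes=10, device="cpu"):
    table = torch.zeros(num_classes, num_classes, dtype=torch.long)
    model.eval()
    for x, y in loader:
        pred = model(x.to(device)).argmax(dim=1).cpu()
        for t, p in zip(y, pred):
            table[t, p] += 1  # rows: true class, columns: predicted class
    return table

# Repeating this for each pretrained model shows how well adversarial examples
# crafted on one model transfer to (i.e. still fool) the others.
```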