Must-read Papers on Textual Adversarial Attack and Defense
auto_LiRPA: An Automatic Linear Relaxation based Perturbation Analysis Library for Neural Networks and General Computational Graphs
Code for our NeurIPS 2019 *spotlight* "Provably Robust Deep Learning via Adversarially Trained Smoothed Classifiers" (see the randomized-smoothing sketch after this list)
A curated list of papers on adversarial machine learning (adversarial examples and defense methods).
A list of awesome resources for adversarial attack and defense methods in deep learning
This repository contains implementations of three adversarial example attack methods (FGSM, IFGSM, MI-FGSM) and defensive distillation as a defense against all of them, using the MNIST dataset (see the FGSM sketch after this list).
[ICML 2024] Unsupervised Adversarial Fine-Tuning of Vision Embeddings for Robust Large Vision-Language Models
Provably defending pretrained classifiers including the Azure, Google, AWS, and Clarifai APIs
CVPR 2022 Workshop Robust Classification
Certified defense against adversarial examples using CROWN and IBP. Also includes a GPU implementation of the CROWN verification algorithm (in PyTorch); see the IBP sketch after this list.
[ICLR 2021] "InfoBERT: Improving Robustness of Language Models from An Information Theoretic Perspective" by Boxin Wang, Shuohang Wang, Yu Cheng, Zhe Gan, Ruoxi Jia, Bo Li, Jingjing Liu
Adversarial attacks on Deep Reinforcement Learning (RL)
Adversarial Distributional Training (NeurIPS 2020)
😎 A curated list of awesome real-world adversarial examples resources
Machine Learning Attack Series
PyTorch implementation of Parametric Noise Injection for adversarial defense
Code for the paper: Adversarial Training Against Location-Optimized Adversarial Patches. ECCV-W 2020.
This repository provides studies on the security of language models for code (CodeLMs).
Learnable Boundary Guided Adversarial Training (ICCV2021)
[IEEE TIP 2021] Self-Attention Context Network: Addressing the Threat of Adversarial Attacks for Hyperspectral Image Classification
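For the smoothed-classifiers entry above, a minimal sketch of how a randomized-smoothing classifier predicts: classify many Gaussian-noised copies of the input and take a majority vote. The names `model`, `sigma`, and `n_samples` are illustrative assumptions, and the paper's certified-radius computation is omitted.

```python
# Minimal randomized-smoothing prediction sketch (illustrative; `model`, `sigma`,
# and `n_samples` are assumed names, and certification is omitted).
import torch

@torch.no_grad()
def smoothed_predict(model, x, sigma=0.25, n_samples=100):
    """Majority vote of the base classifier over Gaussian-noised copies of x."""
    noisy = x.unsqueeze(0) + sigma * torch.randn(n_samples, *x.shape)
    votes = model(noisy).argmax(dim=1)
    return torch.mode(votes).values.item()
```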
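For the FGSM/IFGSM/MI-FGSM entry above, a minimal single-step FGSM sketch in PyTorch, assuming a classifier `model`, inputs `x` scaled to [0, 1], integer labels `y`, and a budget `epsilon` (all names are illustrative, not the linked repository's API).

```python
# Minimal FGSM sketch (assumed names: `model`, `x` in [0, 1], labels `y`, budget `epsilon`).
import torch
import torch.nn.functional as F

def fgsm_attack(model, x, y, epsilon):
    """One gradient-sign step that increases the cross-entropy loss."""
    x_adv = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x_adv), y)
    loss.backward()
    # Perturb in the sign of the input gradient, then clip back to the valid pixel range.
    x_adv = x_adv + epsilon * x_adv.grad.sign()
    return x_adv.clamp(0.0, 1.0).detach()
```

IFGSM repeats this step several times with a smaller step size, and MI-FGSM additionally accumulates a momentum term over the gradients between steps.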
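For the CROWN/IBP entry above, a minimal sketch of interval bound propagation through a single linear layer, the core step IBP repeats through the network; this illustrates the general technique and is not the linked repository's or auto_LiRPA's API.

```python
# Minimal IBP sketch for one linear layer y = x @ W.T + b (illustrative only).
import torch

def ibp_linear(W, b, lb, ub):
    """Propagate elementwise input bounds [lb, ub] to output bounds."""
    center = (ub + lb) / 2               # midpoint of the input box
    radius = (ub - lb) / 2               # half-width of the input box
    out_center = center @ W.t() + b
    out_radius = radius @ W.t().abs()    # |W| maps the half-width
    return out_center - out_radius, out_center + out_radius
```

A ReLU is handled by clamping both bounds at zero; chaining these steps layer by layer yields certified bounds on the network's outputs.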