Diffence: Fencing Membership Privacy With Diffusion Models

This repository contains the code for the paper "Diffence: Fencing Membership Privacy With Diffusion Models," accepted at NDSS 2025.

Diffence is a robust plug-and-play defense mechanism designed to enhance the membership privacy of both undefended models and models trained with state-of-the-art defenses, without compromising model utility.
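
At inference time, Diffence places a diffusion model in front of the protected classifier: each query is slightly perturbed and then regenerated by the diffusion model, and the prediction is made on the regenerated sample(s), which weakens the membership signal carried by the classifier's outputs while keeping predictions consistent. The snippet below is only a conceptual sketch of that idea under assumed interfaces; the diffusion helpers add_noise/denoise, the candidate count, and the plain averaging step are hypothetical stand-ins, not the repository's code or the paper's exact sample-selection strategy.

    # Conceptual sketch only, not the repository's implementation.
    import torch

    @torch.no_grad()
    def diffence_predict(model, diffusion, x, n_candidates=4, t=50):
        """Classify diffusion-regenerated versions of x instead of x itself."""
        candidate_logits = []
        for _ in range(n_candidates):
            x_noisy = diffusion.add_noise(x, t)      # hypothetical helper: forward-diffuse x to step t
            x_regen = diffusion.denoise(x_noisy, t)  # hypothetical helper: reverse-diffuse back to a clean image
            candidate_logits.append(model(x_regen))
        # Aggregate over the regenerated candidates; the paper's utility-preserving
        # selection step is only approximated by this plain average.
        return torch.stack(candidate_logits).mean(dim=0)

Because the defense only wraps the model's inference path, it can be combined with training-time defenses such as SELENA, AdvReg, HAMP, or RelaxLoss, which is exactly how it is evaluated below.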

Table of Contents

  - Installation
  - Experiment Workflow
  - Results
  - Acknowledgments

Installation

  1. Clone the repository:

    git clone https://github.com/bujuef/Diffence.git
    cd Diffence
  2. Create a conda environment and install dependencies:

    conda env create -f environment.yaml
    conda activate diffence-env
  3. If conda is not installed, follow the installation instructions in the official conda documentation.

Experiment Workflow

Preparation

  1. Navigate to the folder of the dataset to be tested, e.g., CIFAR-10:

    cd cifar10
  2. Download and partition the dataset:

    python data_partition.py
  3. Obtain the diffusion model used for Diffence:

    We provide our pretrained diffusion model checkpoints here. Copy the diff_models directory into the corresponding diff_defense folder, e.g., cifar10/diff_defense/diff_models.

    (Optional) Train the diffusion model using this repository.

Execution

  1. Obtain the undefended model and models with existing defenses:

    Our pretrained models are available here. Copy them to the final-all-models folder, e.g., cifar10/final-all-models/resnet/selena.pth.tar.

    (Optional) You can retrain specific defended models using the commands listed in all-train-all.sh.

  2. Test model accuracy and membership privacy:

    cd evaluate_MIAs  # Navigate to the test script folder
    bash evaluate_mia.sh --defense [defense name]  # defense name in {undefended, selena, advreg, hamp, relaxloss}

After completion, the results of the above experiments will be saved in the ./results folder.
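
The attacks run by evaluate_mia.sh measure how reliably an adversary can tell training members from non-members given the model's outputs. As a rough illustration of the kind of signal such attacks exploit (not the attack suite implemented in this repository), a minimal confidence-thresholding membership check could look like the following, where model, the two data loaders, and the threshold are all assumed:

    # Illustrative only; not the attacks used by evaluate_mia.sh.
    import torch
    import torch.nn.functional as F

    @torch.no_grad()
    def confidence_attack_accuracy(model, member_loader, nonmember_loader, threshold=0.9):
        """Guess 'member' whenever the top softmax confidence exceeds the threshold."""
        def max_confidences(loader):
            return torch.cat([F.softmax(model(x), dim=1).max(dim=1).values for x, _ in loader])

        tpr = (max_confidences(member_loader) > threshold).float().mean()     # members flagged as members
        fpr = (max_confidences(nonmember_loader) > threshold).float().mean()  # non-members flagged as members
        return 0.5 * (tpr + (1 - fpr))  # balanced attack accuracy; close to 0.5 means little leakage

An effective defense pushes this kind of balanced attack accuracy toward 0.5 while leaving the model's test accuracy essentially unchanged, which is the trade-off the saved results let you compare for each defense with and without Diffence.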

Results

The results will be saved in Diffence/[dataset_name]/evaluate_MIAs/results. For example, selena and selena_w_diffence correspond to the results of using the SELENA defense alone and of deploying Diffence on top of it, respectively.

Acknowledgments

The implementation of Diffence builds upon code from the following repositories:

We greatly appreciate the contributions from these works.
