Sample implementations of Privacy Preserving DL from Literature

debjyoti0891/privacypreDL


Getting started with PyTorch

mnist.py uses a CNN for classification of the MNIST dataset. It also demonstrates saving the network state to a file and reloading it. The code is based on standard PyTorch tutorials.
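The save-and-reload step can be sketched as follows. This is a minimal illustration using the standard torch.save / load_state_dict API, with a hypothetical tiny linear model standing in for the CNN in mnist.py and a temporary directory standing in for the results folder:

```python
import os
import tempfile

import torch
import torch.nn as nn

# Hypothetical tiny network standing in for the CNN in mnist.py
model = nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.5)

# Save network and optimizer state to files
save_dir = tempfile.mkdtemp()
torch.save(model.state_dict(), os.path.join(save_dir, "model.pth"))
torch.save(optimizer.state_dict(), os.path.join(save_dir, "optimizer.pth"))

# Reload the saved state into fresh instances
model2 = nn.Linear(4, 2)
model2.load_state_dict(torch.load(os.path.join(save_dir, "model.pth")))
optimizer2 = torch.optim.SGD(model2.parameters(), lr=0.01, momentum=0.5)
optimizer2.load_state_dict(torch.load(os.path.join(save_dir, "optimizer.pth")))
```

Saving state_dict objects rather than whole model instances is the usual PyTorch convention, since it keeps checkpoints independent of the class definition's module path.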

dssgd.py implements Distributed Selective Stochastic Gradient Descent (DSSGD) on the MNIST dataset.

Currently supported parameters are as follows:

  1. trainers : number of participants in DSSGD
  2. partition_ratio : fraction of the dataset available to each participant
  3. n_parts : number of global updates per epoch
  4. theta : fraction of the total parameters uploaded to the global state
  5. n_epochs : number of epochs used for training

The usual SGD parameters are also present: batch_size_train, batch_size_test, learning_rate, momentum and log_interval.
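The selective step controlled by theta can be sketched as below: each participant uploads only the theta fraction of its gradient components with the largest magnitude, leaving the remaining global parameters untouched. This is a NumPy sketch of the idea, not the code in dssgd.py; the function name and signature are hypothetical:

```python
import numpy as np

def selective_upload(global_params, local_grads, theta=0.1, lr=0.01):
    """Apply only the theta fraction of largest-magnitude gradient
    components to the global parameters (the selective step of DSSGD)."""
    flat_grads = local_grads.ravel()
    k = max(1, int(theta * flat_grads.size))
    # Indices of the k gradient components with the largest magnitude
    idx = np.argpartition(np.abs(flat_grads), -k)[-k:]
    updated = global_params.ravel().copy()
    updated[idx] -= lr * flat_grads[idx]
    return updated.reshape(global_params.shape)

# Example: with theta = 0.2 on 10 parameters, only 2 entries change
rng = np.random.default_rng(0)
params = np.zeros((2, 5))
grads = rng.standard_normal((2, 5))
new_params = selective_upload(params, grads, theta=0.2)
```

Uploading only a small fraction of the largest updates is what lets DSSGD trade accuracy against the amount of gradient information each participant reveals.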

Create a folder named results in the same directory to store the trained models and optimizers.
