Final project for CS590.
This project explores dropout as an uncertainty measure, following the work of Yarin Gal and Zoubin Ghahramani (*Dropout as a Bayesian Approximation*, ICML 2016), in the context of training-data privacy. We run a membership inference attack against trained models and assess how dropout and differential privacy interact to protect training-set data.
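For background, MC dropout keeps dropout active at inference time and averages several stochastic forward passes; the spread across passes serves as an uncertainty estimate. The sketch below illustrates that idea only and is not the code in this repo's `model.py`; it assumes PyTorch, and names such as `MCDropoutNet` and `mc_predict` are hypothetical.

```python
# Illustrative sketch only (assumes PyTorch); the repo's model.py may differ.
import torch
import torch.nn as nn

class MCDropoutNet(nn.Module):
    """Small classifier whose dropout layers are kept active at inference time."""
    def __init__(self, in_dim=784, hidden=256, n_classes=10, p=0.5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(), nn.Dropout(p),
            nn.Linear(hidden, n_classes),
        )

    def forward(self, x):
        return self.net(x)

@torch.no_grad()
def mc_predict(model, x, n_samples=20):
    """Average softmax outputs over several stochastic forward passes.

    The mean approximates the predictive distribution; the variance across
    passes is a rough per-class uncertainty estimate.
    """
    model.train()  # keep dropout active (safe here: no BatchNorm layers)
    probs = torch.stack(
        [torch.softmax(model(x), dim=-1) for _ in range(n_samples)]
    )
    return probs.mean(dim=0), probs.var(dim=0)
```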
To run the experiment, install the required libraries, then:
- Run `model.py` to train the required target models.
- Run `attack.py` to attack the trained models (a rough sketch of such an attack follows this list).
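For readers unfamiliar with membership inference, the sketch below shows a minimal confidence-threshold attack: overfit models tend to be more confident on points they were trained on, and the attack exploits that gap. It is illustrative only and is not the repo's `attack.py`; the function name and threshold are hypothetical.

```python
# Illustrative sketch only (assumes PyTorch); the repo's attack.py may differ.
import torch

@torch.no_grad()
def confidence_threshold_attack(model, samples, threshold=0.9):
    """Guess 'member' whenever the model's top softmax confidence exceeds a threshold."""
    model.eval()
    guesses = []
    for x, y in samples:  # (input, label) pairs; the label is unused by this attack
        probs = torch.softmax(model(x.unsqueeze(0)), dim=-1)
        guesses.append(bool(probs.max() > threshold))
    return guesses
```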
Model performance during training and inference can be monitored with TensorBoard.
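As a minimal sketch of how such logging might look (assuming PyTorch's TensorBoard bindings; the log directory and tag names are hypothetical and not taken from this repo):

```python
# Illustrative sketch only; log directory and tag names are hypothetical.
from torch.utils.tensorboard import SummaryWriter

writer = SummaryWriter(log_dir="runs/example")  # view with: tensorboard --logdir runs
for step in range(100):
    loss = 1.0 / (step + 1)  # stand-in for a real training loss
    writer.add_scalar("train/loss", loss, step)
writer.close()
```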