Supervised by Prof. Adi Akavia, Secure Cloud Computing Laboratory, Fall 2022-2023
In recent years, cloud computing has emerged as a widely adopted and cost-effective solution for storing and processing large volumes of data. However, one of the major challenges in secure cloud computing is the need to preserve the privacy of sensitive data while allowing for meaningful analysis.
In our project, we present a novel approach to this challenge: a privacy-preserving distributed expectation-maximization (EM) algorithm for Gaussian mixture models (GMMs). By utilizing fully homomorphic encryption (FHE), the proposed method enables centralized federated learning while keeping each party's sensitive data private.
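To convey the general idea, here is a minimal sketch (not the project's actual PPEM implementation) of how such a scheme can be organized: each party runs the E-step locally on its plaintext data, encrypts only the resulting sufficient statistics with TenSEAL's CKKS scheme, and an untrusted aggregator sums the ciphertexts so that a single decryption of the aggregate drives the M-step. The encryption parameters, function names, and the two-client toy setup below are illustrative assumptions, not values taken from the project code.

```python
import numpy as np
import tenseal as ts
from scipy.stats import multivariate_normal

# CKKS context for approximate arithmetic over real numbers.
# Parameter choices here are illustrative only.
context = ts.context(
    ts.SCHEME_TYPE.CKKS,
    poly_modulus_degree=8192,
    coeff_mod_bit_sizes=[60, 40, 40, 60],
)
context.global_scale = 2 ** 40
context.generate_galois_keys()


def local_e_step_stats(x, means, covs, weights):
    """E-step on one client's plaintext data: responsibilities and the
    sufficient statistics the global M-step needs (soft counts, weighted sums)."""
    k = len(weights)
    dens = np.column_stack([
        weights[j] * multivariate_normal.pdf(x, means[j], covs[j])
        for j in range(k)
    ])
    resp = dens / dens.sum(axis=1, keepdims=True)   # responsibilities, shape (n, k)
    n_k = resp.sum(axis=0)                          # soft counts, shape (k,)
    s_k = resp.T @ x                                # weighted sums, shape (k, d)
    return n_k, s_k


def encrypt_stats(n_k, s_k):
    """Each client encrypts its local statistics before sending them out."""
    return ts.ckks_vector(context, np.concatenate([n_k, s_k.ravel()]).tolist())


def aggregate(encrypted_stats):
    """The aggregator only ever sees, and adds, ciphertexts."""
    total = encrypted_stats[0]
    for enc in encrypted_stats[1:]:
        total = total + enc
    return total


# Toy run: two clients, a 2-component 2-D model with fixed initial parameters.
rng = np.random.default_rng(0)
means = np.array([[0.0, 0.0], [3.0, 3.0]])
covs = np.array([np.eye(2), np.eye(2)])
weights = np.array([0.5, 0.5])
clients = [rng.normal(size=(50, 2)), rng.normal(loc=3.0, size=(50, 2))]

enc = [encrypt_stats(*local_e_step_stats(x, means, covs, weights)) for x in clients]
agg = np.array(aggregate(enc).decrypt())            # decrypt the aggregate only

n_k, s_k = agg[:2], agg[2:].reshape(2, 2)
new_means = s_k / n_k[:, None]                      # M-step mean update
print(new_means)
```

In this sketch CKKS is used because EM's sufficient statistics are real-valued, and only ciphertext additions are performed, which keeps the homomorphic computation shallow; decrypted values are therefore close to, but not exactly equal to, their plaintext counterparts.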
- Read the Background document, which provides an overview of the fundamental concepts behind Gaussian mixture models, the expectation-maximization algorithm, and fully homomorphic encryption.
- View the 2D visualization in the Colab Notebooks.
- Read the Proposed Approach to understand our solution in detail and view the results.
- View the source code of GMM: EM vs. PPEM.
- Take a look at our final presentation.
- numpy
- matplotlib
- scipy
- tenseal
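Assuming a standard Python environment, these dependencies can typically be installed with pip (package names as listed above; pinning versions may be advisable for reproducibility):

```bash
pip install numpy matplotlib scipy tenseal
```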