The first assignment consisted of four parts:
- Implementing an MLP in NumPy from the ground up by deriving the forward and backward passes on paper and translating them to code
- Implementing the same MLP with PyTorch
- Writing a custom Layer Normalization module with manual forward and backward passes
- Implementing a VGG network architecture and comparing it to a transfer-learning approach
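The core of the first part is a hand-derived backward pass. A minimal sketch of what that looks like for a one-hidden-layer MLP in NumPy (shapes, names, and the ReLU/linear architecture are illustrative assumptions, not the assignment's exact interface):

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def forward(x, W1, b1, W2, b2):
    """Forward pass; returns the output and a cache for the backward pass."""
    h_pre = x @ W1 + b1          # hidden pre-activation
    h = relu(h_pre)              # hidden activation
    y = h @ W2 + b2              # linear output layer
    return y, (x, h_pre, h)

def backward(dy, cache, W2):
    """Backward pass: propagate the upstream gradient dy through the net."""
    x, h_pre, h = cache
    dW2 = h.T @ dy               # gradient w.r.t. output weights
    db2 = dy.sum(axis=0)
    dh = dy @ W2.T               # gradient flowing into the hidden layer
    dh_pre = dh * (h_pre > 0)    # ReLU derivative is an indicator
    dW1 = x.T @ dh_pre
    db1 = dh_pre.sum(axis=0)
    return dW1, db1, dW2, db2

x = rng.normal(size=(4, 3))
W1, b1 = rng.normal(size=(3, 5)), np.zeros(5)
W2, b2 = rng.normal(size=(5, 2)), np.zeros(2)
y, cache = forward(x, W1, b1, W2, b2)
dW1, db1, dW2, db2 = backward(np.ones_like(y), cache, W2)
```

A finite-difference check against these analytic gradients is the usual way to verify such a derivation before training.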
The second assignment consisted of three parts:
- Implementing a Long Short-Term Memory network (LSTM) as well as a bi-directional LSTM from scratch and comparing their performance on a simple sequence dataset
- Using the built-in PyTorch LSTM module for text generation
- Answering theoretical questions about Graph Neural Networks
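For reference, the gate equations implemented from scratch in the first part can be sketched as a single LSTM cell step in NumPy (the stacked weight layout and names are illustrative assumptions):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM step; W stacks the input/forget/cell/output projections."""
    z = np.concatenate([x, h_prev]) @ W + b   # joint projection of [x, h]
    H = h_prev.size
    i = sigmoid(z[0 * H:1 * H])   # input gate
    f = sigmoid(z[1 * H:2 * H])   # forget gate
    g = np.tanh(z[2 * H:3 * H])   # candidate cell state
    o = sigmoid(z[3 * H:4 * H])   # output gate
    c = f * c_prev + i * g        # new cell state
    h = o * np.tanh(c)            # new hidden state
    return h, c

rng = np.random.default_rng(0)
D, H = 4, 3
W = rng.normal(scale=0.1, size=(D + H, 4 * H))
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for x in rng.normal(size=(5, D)):  # unroll over a length-5 sequence
    h, c = lstm_step(x, h, c, W, b)
```

A bi-directional LSTM simply runs a second cell over the reversed sequence and concatenates the two hidden states per time step.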
The third assignment consisted of three parts:
- Implementing a Variational Autoencoder (VAE)
- Implementing a Generative Adversarial Network (GAN)
- Building a generative flow-based model
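The VAE part hinges on the reparameterization trick and the closed-form KL term of the ELBO; a minimal NumPy sketch (names and shapes are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def reparameterize(mu, log_var):
    """Sample z ~ N(mu, sigma^2) as z = mu + sigma * eps with eps ~ N(0, I),
    so the sampling step stays differentiable w.r.t. mu and log_var."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def kl_to_standard_normal(mu, log_var):
    """KL( N(mu, sigma^2) || N(0, I) ) in closed form, summed over latents."""
    return 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var, axis=-1)

mu = np.zeros((2, 4))
log_var = np.zeros((2, 4))              # unit variance
z = reparameterize(mu, log_var)
kl = kl_to_standard_normal(mu, log_var)  # zero when q equals the prior
```

The GAN and flow-based parts replace this objective with an adversarial loss and an exact change-of-variables log-likelihood, respectively.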
Copyright © 2020 Nils Lehmann.
This project is distributed under the MIT license. If you are a UvA student, please follow the UvA regulations governing Fraud and Plagiarism.