I want to go through this course and will publish my progress here. I hope this process helps me build consistency and momentum as I work through the whole course.
- Q1: k-Nearest Neighbor classifier (20 points) 17 Dec 2017
- Q2: Training a Support Vector Machine (25 points) 19 Dec 2017 | 20 Dec 2017
- Q3: Implement a Softmax classifier (20 points) 23 Dec 2017 | 24 Dec 2017
- Q4: Two-Layer Neural Network (25 points) 26 Dec 2017
- Q5: Higher Level Representations: Image Features (10 points) 26 Dec 2017
- Q6: Cool Bonus: Do something extra! (+10 points)
- Recap: SVM Vectorizing Gradients
- Q1: Fully-connected Neural Network (25 points) 27 Dec 2017
- Q2: Batch Normalization (25 points) 29 Dec 2017
- Q2A: Implement the alternative Batch Normalization gradient
- Q3: Dropout (10 points) 27 Dec 2017
- Q4: Convolutional Networks (30 points) 31 Dec 2017
- Q5: PyTorch / TensorFlow on CIFAR-10 (10 points) 02 Jan 2018
- Q6: Do something extra! (up to +10 points)
- Q1: Image Captioning with Vanilla RNNs (25 points) 06 Jan 2018
- Q2: Image Captioning with LSTMs (30 points) 06 Jan 2018
- Q2A: Implement a good captioning model
- Q3: Network Visualization: Saliency maps, Class Visualization, and Fooling Images (15 points) 08 Jan 2018
- Q4: Style Transfer (15 points) 13 Jan 2018
- Q5: Generative Adversarial Networks (15 points)
TBA Later
Done with style transfer; it was an interesting assignment and a lot of fun. It was quite complicated, but I managed to do it. Also finished GANs. It was fucking amazing.
The first task was very hard: I had no idea how to use PyTorch for it, and there were almost no explicit instructions, so I struggled for a while. Then I grokked it. The bad part is that I used other people's solutions to get an idea of what to do (especially how to pass gradients, and what to do next).
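For the record, the trick I had to learn was asking PyTorch for gradients with respect to the input image. Here is a minimal sketch of how I understand the saliency-map version of it; the function name and the `model` argument are my placeholders, not the assignment's exact API:

```python
import torch

def compute_saliency_maps(X, y, model):
    """Saliency = abs of the input gradient of the correct-class score.

    X: (N, 3, H, W) image batch, y: (N,) long tensor of labels,
    model: a trained convnet that maps X to (N, num_classes) scores.
    """
    model.eval()
    X = X.clone().requires_grad_(True)       # ask PyTorch to pass gradients to X
    scores = model(X)                        # (N, num_classes)
    correct = scores.gather(1, y.view(-1, 1)).squeeze()  # correct-class scores
    correct.sum().backward()                 # gradients w.r.t. the images
    saliency, _ = X.grad.abs().max(dim=1)    # max over color channels -> (N, H, W)
    return saliency
```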
Spent two days watching the CS231n video lectures; finally got to RNNs today and can complete the assignments. The good: I sat down and implemented them easily. It feels a bit wrong that I can do the assignment so easily; it ought to be harder. Thanks to the course creators for the many helper functions.
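The single RNN time step really is tiny once written down. A minimal sketch, with a signature along the lines of the assignment's (I've dropped the cache the real version returns for the backward pass):

```python
import numpy as np

def rnn_step_forward(x, prev_h, Wx, Wh, b):
    """One vanilla RNN step: next_h = tanh(x Wx + prev_h Wh + b).

    x: (N, D) input, prev_h: (N, H) previous hidden state,
    Wx: (D, H), Wh: (H, H), b: (H,).
    """
    next_h = np.tanh(x.dot(Wx) + prev_h.dot(Wh) + b)
    return next_h
```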
Implementing LSTMs was pretty hard; I spent quite a lot of time and even drew the computation graph. Just when I was ready to give up and look for solutions on the Internet, I thought, "OK, one more try and I'm done!" Then something clicked and everything worked! That's just fucking magic! So I really did implement everything I needed to build LSTMs.
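The step itself boils down to four gates computed from one stacked matrix multiply. A minimal sketch of a single forward step, assuming the stacked-weights layout; the real version would also return a cache for the backward pass:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step_forward(x, prev_h, prev_c, Wx, Wh, b):
    """One LSTM time step.

    x: (N, D) input, prev_h/prev_c: (N, H) previous hidden/cell state,
    Wx: (D, 4H), Wh: (H, 4H), b: (4H,) -- all four gates stacked.
    """
    N, H = prev_h.shape
    a = x.dot(Wx) + prev_h.dot(Wh) + b           # (N, 4H) pre-activations
    i = sigmoid(a[:, 0:H])                       # input gate
    f = sigmoid(a[:, H:2*H])                     # forget gate
    o = sigmoid(a[:, 2*H:3*H])                   # output gate
    g = np.tanh(a[:, 3*H:4*H])                   # candidate cell update
    next_c = f * prev_c + i * g
    next_h = o * np.tanh(next_c)
    return next_h, next_c
```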
Ho-ho, I implemented the architecture easily; it overfits nicely but still works well on the data and gets 75% accuracy.
Woof! I did it! I've completed the convolutional networks assignment; it was pretty interesting, working with 3D slices of data and so on. What remains to be done is a fast implementation of these networks. Everything works pretty well!
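Those "3D slices" are exactly what the naive forward pass loops over. A minimal sketch, assuming zero padding and a single stride value; variable names are my own:

```python
import numpy as np

def conv_forward_naive(x, w, b, stride=1, pad=1):
    """Naive convolution: x (N, C, H, W), w (F, C, HH, WW), b (F,)."""
    N, C, H, W = x.shape
    F, _, HH, WW = w.shape
    xp = np.pad(x, ((0, 0), (0, 0), (pad, pad), (pad, pad)), mode='constant')
    H_out = 1 + (H + 2 * pad - HH) // stride
    W_out = 1 + (W + 2 * pad - WW) // stride
    out = np.zeros((N, F, H_out, W_out))
    for n in range(N):                     # each image
        for f in range(F):                 # each filter
            for i in range(H_out):         # each output row
                for j in range(W_out):     # each output column
                    h0, w0 = i * stride, j * stride
                    # 3D slice of the padded input times the 3D filter
                    out[n, f, i, j] = np.sum(
                        xp[n, :, h0:h0 + HH, w0:w0 + WW] * w[f]) + b[f]
    return out
```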
Finally did this task; it was hard and I spent a lot of time on it. The main reason was lack of focus and attention. Checking shapes helped a lot, but I struggled with the final gradient of the "−" gate, and I used external resources for it.
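For future me: the "−" gate is the x − mean subtraction, and the tricky part is that it receives gradient along two paths (directly, and through the variance). A sketch of the staged backward pass as I understand it; the cache layout here is my own assumption:

```python
import numpy as np

def batchnorm_backward(dout, cache):
    """Backward pass through batch normalization, gate by gate.

    cache holds values from the forward pass, where
    x_hat = (x - mu) / sqrt(var + eps) and gamma is the scale parameter.
    """
    x, x_hat, mu, var, gamma, eps = cache
    N, D = dout.shape

    dgamma = np.sum(dout * x_hat, axis=0)            # (D,)
    dbeta = np.sum(dout, axis=0)                     # (D,)

    dx_hat = dout * gamma                            # (N, D)
    inv_std = 1.0 / np.sqrt(var + eps)

    # gradient through the division by the standard deviation
    dvar = np.sum(dx_hat * (x - mu), axis=0) * -0.5 * inv_std**3
    # the "-" gate: x - mu receives gradient from two paths
    dmu = np.sum(dx_hat * -inv_std, axis=0) + dvar * np.mean(-2.0 * (x - mu), axis=0)
    dx = dx_hat * inv_std + dvar * 2.0 * (x - mu) / N + dmu / N
    return dx, dgamma, dbeta
```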
Implemented a fully-connected neural network with arbitrary architecture. Things are getting quite easy now; I understand how to work through this course, so I just do the assignments. Also implemented Dropout. The bad: I skipped the descriptions of the different optimization techniques.
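Dropout at least is short enough to write from memory. A minimal sketch of inverted dropout; note that I treat p as the probability of keeping a unit, which may differ from the assignment's convention:

```python
import numpy as np

def dropout_forward(x, p=0.5, train=True):
    """Inverted dropout: scale at train time so test time is a no-op."""
    if train:
        mask = (np.random.rand(*x.shape) < p) / p   # drop units and rescale
        return x * mask, mask
    return x, None

def dropout_backward(dout, mask):
    # Gradient flows only through the kept units, with the same rescale.
    return dout * mask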
The good: I figured out how to implement a neural net and, in general, completed the first assignment. The bad: I worked with low focus in the morning.
GOOD:
- implemented cross-validation and SGD
BAD:
- spent too much time on vectorizing gradients. This guide helped me a lot
GOOD:
- I successfully figured out how to correctly calculate the SVM gradients. The optimization course page helped me a lot! I almost figured it out by myself, but missed the indicator function they use
- For the first time in my life I understood how to write vectorized versions, and it's very simple: you copy-paste the naive implementation and vectorize it step by step, one loop at a time (see the sketch after this list). Vectorizing the loss took only 25 minutes
BAD:
- I left out the indicator function in the gradients
- I didn't have enough time to implement the vectorized gradients
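For when I come back to this: a sketch of where that step-by-step vectorization should end up, indicator function included. The (X, W, y) shapes follow the usual convention, which is an assumption on my part:

```python
import numpy as np

def svm_loss_vectorized(W, X, y, reg):
    """Multiclass SVM loss and gradient. X: (N, D), W: (D, C), y: (N,)."""
    N = X.shape[0]
    scores = X.dot(W)                                # (N, C)
    correct = scores[np.arange(N), y][:, None]       # (N, 1) correct-class scores
    margins = np.maximum(0, scores - correct + 1.0)  # hinge with delta = 1
    margins[np.arange(N), y] = 0                     # don't count the correct class
    loss = margins.sum() / N + reg * np.sum(W * W)

    # Indicator function: 1 where a margin is violated; the correct-class
    # column collects minus the number of violations in its row.
    indicator = (margins > 0).astype(float)          # (N, C)
    indicator[np.arange(N), y] = -indicator.sum(axis=1)
    dW = X.T.dot(indicator) / N + 2 * reg * W
    return loss, dW
```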
So I created this repo and implemented the k-Nearest Neighbor classifier. During this task I noticed a few interesting things.
- The half-vectorized (one-loop) implementation works worse than the two-loop one, probably because my norm calculation is inefficient
- Vectorizing the L2 distance did not seem straightforward at first, but the simple rule
(a-b)^2 = a^2 - 2*a*b + b^2
helps a lot. I really had to play around to compute the correct squares of the matrices, but it was fun (see the sketch after this list).
- I got better at manipulating matrices with `reshape`, `hstack`, and `vstack`. All this stuff just looked obvious to me today.
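And here is a minimal sketch of the fully vectorized distance matrix that falls out of that rule; the function and variable names are mine, not necessarily the starter code's:

```python
import numpy as np

def compute_distances_no_loops(X, X_train):
    """Pairwise L2 distances via (a - b)^2 = a^2 - 2*a*b + b^2.

    X: (num_test, D), X_train: (num_train, D) -> (num_test, num_train).
    """
    test_sq = np.sum(X ** 2, axis=1).reshape(-1, 1)   # (num_test, 1)
    train_sq = np.sum(X_train ** 2, axis=1)           # (num_train,)
    cross = X.dot(X_train.T)                          # (num_test, num_train)
    # broadcasting adds the row and column squared norms to every entry;
    # clip tiny negatives caused by floating-point error before the sqrt
    sq = test_sq - 2 * cross + train_sq
    return np.sqrt(np.maximum(sq, 0.0))
```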
GOOD:
- easier than before to get through the task; I pretty much nailed it
- I did everything by myself
BAD:
- I had to Google for a simple hint about vectorizing L2 distances. It's quite obvious, and I should have just used basic math rules.
So tomorrow I'm going to close out the SVM task.