
Commit

add DL for coders part two to courses
mahdihaghverdi committed Apr 5, 2023
1 parent 7d3920d commit 5acb8c8
Showing 3 changed files with 20 additions and 1 deletion.
21 changes: 20 additions & 1 deletion planner/courses.tex
@@ -79,4 +79,23 @@ \chapter{Courses}
\end{itemize}



\clearpage
\section{From Deep Learning Foundations to Stable Diffusion}

\para{Three years ago we pioneered Deep Learning from the Foundations, an in-depth course that started right from the foundations—implementing and GPU-optimising matrix multiplications and initialisations—and covered from-scratch implementations of all the key applications of the fastai library.}

\para{This year, we’re going \say{from the foundations} again, but this time we’re going further. \textbf{Much} further! This time, we’re going all the way through to implementing the astounding Stable Diffusion\footnote{\url{https://stability.ai/blog/stable-diffusion-public-release}} algorithm. That’s the killer app\footnote{\url{https://www.pcworld.com/article/916785/creating-ai-art-local-pc-stable-diffusion.html}} that made the internet freak out\footnote{\url{https://devops.com/stable-diffusion-public-richixbw/}}, and caused the media to say\footnote{\url{https://arstechnica.com/information-technology/2022/09/with-stable-diffusion-you-may-never-believe-what-you-see-online-again/}} \say{you may never believe what you see online again}.}

\para{Stable Diffusion, and diffusion methods in general, are a great learning goal for many reasons. For one thing, of course, you can create amazing stuff with these algorithms! But to take the technique to the next level, and create things that no one has seen before, you need to deeply understand what’s happening under the hood. With that understanding, you can craft your own loss functions, initialization methods, multi-model mixups, and more, to build entirely new applications.}

\para{Just as important: it’s a great learning goal because nearly every key technique in modern deep learning comes together in these methods. Contrastive learning, transformer models, auto-encoders, CLIP embeddings, latent variables, U-Nets, ResNets, and much more are involved in creating a single image.}
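\para{As a rough illustration of how those pieces come together at inference time, here is a minimal sketch using the Hugging Face \texttt{diffusers} library (the library, model weights, and prompt are illustrative assumptions, not course material): a CLIP text encoder conditions a U-Net that iteratively denoises latent variables, and an autoencoder decodes the final latents into an image.}

\begin{verbatim}
# Minimal sketch, assuming the `torch` and `diffusers` packages and the
# publicly released Stable Diffusion v1.5 weights (illustrative only).
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# One call chains the components listed above: a CLIP text encoder,
# a U-Net denoising latent variables step by step, and an autoencoder
# decoding the final latents into pixels.
image = pipe("an astronaut riding a horse, oil painting").images[0]
image.save("astronaut.png")
\end{verbatim}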

\para{This is all cutting-edge stuff, so to ensure we bring the latest techniques to you, we’re teaming up with the folks who brought Stable Diffusion to the world: \url{stability.ai}. In many ways, stability.ai are kindred spirits to fast.ai. They are, like us, a self-funded research lab. And like us, their focus is smashing down any gates that make cutting-edge AI less accessible. So it makes sense for us to team up on this audacious goal of teaching Stable Diffusion from the foundations.}

\para{The course will be available for free online from early 2023. But if you want to join the course right as it’s made, along with hundreds of the world’s leading deep learning practitioners, then you can register to join the virtual live course\footnote{\url{https://itee.uq.edu.au/event/2022/practical-deep-learning-coders-uq-fastai-part-2}} through our official academic partner, the University of Queensland (UQ). UQ will have registrations open in the next few days, so keep an eye on the link above.}

\para{During the live course, we’ll be learning to read and implement the latest papers, with lots of opportunity to practice and get feedback. Many past participants in fast.ai’s live courses have described the experience as \say{life-changing}… and it’s our sincere hope that this course will be our best ever.}

\para{To get the most out of this course, you should be a reasonably confident deep learning practitioner. If you’ve finished fast.ai’s Practical Deep Learning course\footnote{\url{https://course.fast.ai/}} then you’ll be ready! If you haven’t done that course, but are comfortable with building an SGD training loop from scratch in Python, being competitive in Kaggle competitions, using modern NLP and computer vision algorithms for practical problems, and working with PyTorch and fastai, then you will be ready to start the course. (If you’re not sure, then I strongly recommend getting started with Practical Deep Learning now—if you push, you’ll be done before the new course starts!)}
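\para{If you’re not sure what \say{an SGD training loop from scratch} looks like in practice, here is a minimal sketch in plain PyTorch (the toy data, model, and learning rate are illustrative only):}

\begin{verbatim}
# A from-scratch SGD loop on a toy linear-regression problem
# (plain PyTorch, no fastai; purely illustrative).
import torch

x = torch.randn(100, 3)                        # toy inputs
y = x @ torch.tensor([2.0, -1.0, 0.5]) + 1.0   # toy linear targets

w = torch.zeros(3, requires_grad=True)         # parameters
b = torch.zeros(1, requires_grad=True)
lr = 0.1                                       # learning rate

for epoch in range(100):
    pred = x @ w + b                           # forward pass
    loss = ((pred - y) ** 2).mean()            # mean squared error
    loss.backward()                            # backward pass
    with torch.no_grad():                      # the SGD update itself
        w -= lr * w.grad
        b -= lr * b.grad
        w.grad.zero_()
        b.grad.zero_()
\end{verbatim}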

\para{If you’re an alumnus of Deep Learning for Coders, you’ll know that course sets you up to be an effective deep learning practitioner. This new course will take you to the next level, creating novel applications that bring multiple techniques together, and understanding and implementing research papers. Alumni of previous versions of fast.ai’s \say{part 2} courses have even gone on to publish deep learning papers in top conferences and journals, and have joined highly regarded research labs and startups.}
Binary file added planner/images/diffusion.webp
Binary file not shown.
Binary file modified planner/planner.pdf
Binary file not shown.

