T81-558: Applications of Deep Neural Networks

Washington University in St. Louis

Instructor: Jeff Heaton

The content of this course changes as technology evolves; to keep up to date with the changes, follow me on GitHub.

Fall 2019, Mondays, Section 1: 2:30-5:20 PM, Section 2: 6:00-9:00 PM, online and in person; classroom: TBA

Course Description

Deep learning is a group of exciting new technologies for neural networks. Through a combination of advanced training techniques and neural network architectural components, it is now possible to create neural networks that can handle tabular data, images, text, and audio as both input and output. Deep learning allows a neural network to learn hierarchies of information in a way that is similar to the function of the human brain. This course introduces the student to classic neural network structures, Convolutional Neural Networks (CNN), Long Short-Term Memory (LSTM), Gated Recurrent Units (GRU), Generative Adversarial Networks (GAN), and reinforcement learning. Application of these architectures to computer vision, time series, security, natural language processing (NLP), and data generation will be covered. High Performance Computing (HPC) aspects will demonstrate how deep learning can be leveraged both on graphics processing units (GPUs) and on grids. The focus is primarily on the application of deep learning to problems, with some introduction to the mathematical foundations. Students will use the Python programming language to implement deep learning using Google TensorFlow and Keras. It is not necessary to know Python prior to this course; however, familiarity with at least one programming language is assumed. This course will be delivered in a hybrid format that includes both classroom and online instruction.
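As a taste of the style of code used throughout the course, the sketch below (not taken from the course notebooks) builds and trains a small Keras feedforward network on synthetic tabular data. The layer sizes, synthetic data, and hyperparameters are illustrative assumptions only.

    # Minimal Keras example on synthetic tabular data (illustrative only)
    import numpy as np
    import tensorflow as tf

    x = np.random.rand(1000, 4).astype("float32")    # 1000 rows, 4 features (assumed)
    y = np.random.randint(0, 3, size=(1000,))        # 3 classes (assumed)

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(25, activation="relu", input_shape=(4,)),
        tf.keras.layers.Dense(10, activation="relu"),
        tf.keras.layers.Dense(3, activation="softmax"),   # one output per class
    ])
    model.compile(loss="sparse_categorical_crossentropy", optimizer="adam")
    model.fit(x, y, epochs=5, verbose=0)

Later modules replace the synthetic data with real datasets and add regularization, callbacks, and more complex architectures.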

Objectives

  1. Explain how neural networks (deep and otherwise) compare to other machine learning models.
  2. Determine when a deep neural network would be a good choice for a particular problem.
  3. Demonstrate your understanding of the material through a final project uploaded to GitHub.

Syllabus

This syllabus presents the expected class schedule, due dates, and reading assignments. Download current syllabus.

Module Content
Module 1
Meet on 08/26/2019
  • Part 1.1: Course Overview
  • Part 1.2: Introduction to Python
  • Part 1.3: Python Lists, Dictionaries, Sets & JSON
  • Part 1.4: File Handling
  • Part 1.5: Functions, Lambdas, and Map/Reduce
  • Python Preliminaries
  • We will meet on campus this week! (first meeting)
Module 2
Week of 09/09/2019
  • Part 2.1: Introduction to Pandas for Deep Learning
  • Part 2.2: Encoding Categorical Values in Pandas
  • Part 2.3: Grouping, Sorting, and Shuffling
  • Part 2.4: Using Apply and Map in Pandas
  • Part 2.5: Feature Engineering in Pandas
  • Module 1 Assignment Due: 09/10/2019
Module 3
Week of 09/16/2019
  • Part 3.1: Deep Learning and Neural Network Introduction
  • Part 3.2: Introduction to Tensorflow & Keras
  • Part 3.3: Saving and Loading a Keras Neural Network
  • Part 3.4: Early Stopping in Keras to Prevent Overfitting
  • Part 3.5: Extracting Keras Weights and Manual Neural Network Calculation
  • TensorFlow and Keras for Neural Networks
  • Module 2 Assignment due: 09/17/2019
Module 4
Week of 09/23/2019
  • Part 4.1: Encoding a Feature Vector for Keras Deep Learning
  • Part 4.2: Keras Multiclass Classification for Deep Neural Networks with ROC and AUC
  • Part 4.3: Keras Regression for Deep Neural Networks with RMSE
  • Part 4.4: Backpropagation, Nesterov Momentum, and ADAM Training
  • Part 4.5: Neural Network RMSE and Log Loss Error Calculation from Scratch
  • Module 3 Assignment due: 09/24/2019
Module 5
Meet on 09/30/2019
  • Part 5.1: Introduction to Regularization: Ridge and Lasso
  • Part 5.2: Using K-Fold Cross Validation with Keras
  • Part 5.3: Using L1 and L2 Regularization with Keras to Decrease Overfitting
  • Part 5.4: Dropout for Keras to Decrease Overfitting (a short code sketch of these regularization techniques appears after the schedule)
  • Part 5.5: Bootstrapping and Benchmarking Hyperparameters
  • Module 4 Assignment due: 10/01/2019
  • We will meet on campus this week! (2nd Meeting)
Module 6
Week of 10/07/2019
  • Part 6.1: Image Processing in Python
  • Part 6.2: Keras Neural Networks for MNIST and Fashion MNIST
  • Part 6.3: Implementing a ResNet in Keras
  • Part 6.4: Computer Vision with OpenCV
  • Part 6.5: Recognizing Multiple Images with Darknet
  • Module 5 Assignment due: 10/08/2019
Module 7
Week of 10/14/2019
  • Part 7.1: Introduction to GANs for Image and Data Generation
  • Part 7.2: Implementing a GAN in Keras
  • Part 7.3: Face Generation with StyleGAN and Python
  • Part 7.4: GANs for Semi-Supervised Learning in Keras
  • Part 7.5: An Overview of GAN Research
  • Module 6 Assignment due: 10/15/2019
Module 8
Meet on 10/21/2019
  • Part 8.1: Introduction to Kaggle
  • Part 8.2: Building Ensembles with Scikit-Learn and Keras
  • Part 8.3: How Should You Architect Your Keras Neural Network: Hyperparameters
  • Part 8.4: Bayesian Hyperparameter Optimization for Keras
  • Part 8.5: Current Semester's Kaggle
  • Module 7 Assignment due: 10/22/2019
  • We will meet on campus this week! (3rd Meeting)
Module 9
Week of 10/28/2019
  • Part 9.1: Introduction to Keras Transfer Learning
  • Part 9.2: Popular Pretrained Neural Networks for Keras
  • Part 9.3: Transfer Learning for Computer Vision and Keras
  • Part 9.4: Transfer Learning for Languages and Keras
  • Part 9.5: Transfer Learning for Keras Feature Engineering
  • Regularization and Dropout
  • Module 8 Assignment due: 10/29/2019
Module 10
Week of 11/04/2019
  • Part 10.1: Time Series Data Encoding for Deep Learning, TensorFlow and Keras
  • Part 10.2: Programming LSTM with Keras and TensorFlow
  • Part 10.3: Image Captioning with Keras and TensorFlow
  • Part 10.4: Temporal CNN in Keras and TensorFlow
  • Part 10.5: Predicting the Stock Market with Keras and TensorFlow
  • Time Series and LSTM/GRU Networks
  • Module 9 Assignment due: 11/05/2019
Module 11
Week of 11/11/2019
Module 12
Meet on 11/18/2019
  • Security and Deep Learning
  • Kaggle Assignment due: 11/17/2019 (approximately 4-6 PM, due to Kaggle's GMT timezone)
  • We will meet on campus this week! (4th Meeting)
Module 13
Week of 11/25/2019
  • Advanced/New Deep Learning Topics
Module 14
Week of 12/02/2019
  • GPU, HPC and Cloud
  • Final Project due 12/09/2019
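The sketch below, assuming TensorFlow 2.x and tf.keras, illustrates three of the techniques named in Modules 3 and 5: L2 weight regularization, dropout, and early stopping. The synthetic data, layer sizes, penalty strength, and patience value are assumptions made for the example, not course-specified settings.

    # Illustrative sketch of L2 regularization, dropout, and early stopping in Keras
    import numpy as np
    import tensorflow as tf

    x = np.random.rand(500, 10).astype("float32")   # synthetic features (assumed)
    y = np.random.rand(500, 1).astype("float32")    # synthetic regression target (assumed)

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(
            50, activation="relu", input_shape=(10,),
            kernel_regularizer=tf.keras.regularizers.l2(1e-4)),  # L2 weight penalty
        tf.keras.layers.Dropout(0.5),                            # randomly drop units during training
        tf.keras.layers.Dense(1),                                # linear output for regression
    ])
    model.compile(loss="mean_squared_error", optimizer="adam")

    # Early stopping: halt training once validation loss stops improving
    monitor = tf.keras.callbacks.EarlyStopping(
        monitor="val_loss", patience=5, restore_best_weights=True)
    model.fit(x, y, validation_split=0.2, epochs=100,
              callbacks=[monitor], verbose=0)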

Datasets

  • Iris - Classification of three iris species.
  • Auto MPG - Regression to determine MPG (see the loading sketch below).
  • WC Breast Cancer - Binary classification: malignant or benign.
  • toy1 - Regression for the weights of geometric solids.

Note: Other datasets may be added as the class progresses.
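As an example of working with the datasets above, the sketch below loads a local copy of the Auto MPG data with Pandas and prepares it for a Keras regression model. The file path is hypothetical, and the assumption that the file uses the standard UCI column names (including "horsepower", "origin", "name", and "mpg") is mine, not the course's.

    # Illustrative sketch: load the Auto MPG data with Pandas and prepare features
    import pandas as pd

    df = pd.read_csv("auto-mpg.csv", na_values=["NA", "?"])  # hypothetical local path

    # Fill missing horsepower values with the median
    df["horsepower"] = df["horsepower"].fillna(df["horsepower"].median())

    # One-hot encode the categorical "origin" column
    df = pd.get_dummies(df, columns=["origin"], prefix="origin")

    # Split features and regression target
    x = df.drop(columns=["mpg", "name"]).values
    y = df["mpg"].values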

Final Project

For the final project, you can either complete a security project or choose your own dataset and fit a neural network to it. For more information:

  • Security Project - See Canvas for more information.
  • Independent Project - Choose your own dataset or one of my suggestions.

