CCS-Lab/workshop_2018winter_deepLearning


This repository contains resources for the Deep Learning study group held with members of CCS-Lab in Winter 2018 (January - February 2018). All materials here are intended to be used alongside Sung Kim's lectures and their accompanying code.

Week 1 (2018-01-15)

Week 2 (2018-01-22)

  • Lectures
    • Logistic (Regression) Classification
      • Introducing the hypothesis function
      • Introducing the cost function
      • Implementation in TensorFlow (see the sketch after this list)
    • Softmax Regression (Multinomial Logistic Regression)
      • Introducing the multinomial concept
      • Introducing the cost function
      • Lab 1: Implementation in TensorFlow
      • Lab 2: A fancier implementation in TensorFlow
  • Materials
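
For orientation, here is a minimal logistic-regression sketch in the TensorFlow 1.x style used by the DeepLearningZeroToAll code of that era. The toy data, variable names, and hyperparameters are illustrative assumptions, not taken from the course materials.

```python
import numpy as np
import tensorflow as tf  # assumes TensorFlow 1.x, as in the 2018 course code

# Toy binary-classification data (illustrative only)
x_data = np.array([[1., 2.], [2., 3.], [3., 1.],
                   [4., 3.], [5., 3.], [6., 2.]], dtype=np.float32)
y_data = np.array([[0.], [0.], [0.], [1.], [1.], [1.]], dtype=np.float32)

X = tf.placeholder(tf.float32, shape=[None, 2])
Y = tf.placeholder(tf.float32, shape=[None, 1])
W = tf.Variable(tf.random_normal([2, 1]), name='weight')
b = tf.Variable(tf.random_normal([1]), name='bias')

# Hypothesis: sigmoid(XW + b); cost: binary cross-entropy
hypothesis = tf.sigmoid(tf.matmul(X, W) + b)
cost = -tf.reduce_mean(Y * tf.log(hypothesis) + (1 - Y) * tf.log(1 - hypothesis))
train = tf.train.GradientDescentOptimizer(learning_rate=0.01).minimize(cost)

# Threshold at 0.5 to get class predictions and measure accuracy
predicted = tf.cast(hypothesis > 0.5, dtype=tf.float32)
accuracy = tf.reduce_mean(tf.cast(tf.equal(predicted, Y), dtype=tf.float32))

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(2001):
        sess.run(train, feed_dict={X: x_data, Y: y_data})
    print("accuracy:", sess.run(accuracy, feed_dict={X: x_data, Y: y_data}))
```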

Week 3 (2018-01-29)

  • Lectures
    • Practical aspects of ML and a few tips
      • Learning rate, overfitting, and generalization (regularization)
      • Training/testing data sets
      • Lab 1: Implementation in TensorFlow (learning rate, evaluating performance on training/test sets; see the sketch after this list)
      • Lab 2: Meet the MNIST dataset
    • Basic concepts of deep learning, its problems, and their solutions
      • Basic concepts of deep learning: the beginning and the XOR problem
      • Basic concepts of deep learning 2: back-propagation and the 2006/2007 emergence of 'deep'
      • Lab: Tensor Manipulation
  • Materials
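
A rough sketch of the train/test idea from this week, again in TensorFlow 1.x style: train a softmax classifier on one set and evaluate accuracy only on a held-out set. The tiny 3-class data and the 0.1 learning rate are illustrative assumptions.

```python
import numpy as np
import tensorflow as tf  # assumes TensorFlow 1.x

# Illustrative 3-class data, split into training and test sets
x_train = np.array([[1, 2, 1], [1, 3, 2], [1, 3, 4], [1, 5, 5],
                    [1, 7, 5], [1, 2, 5], [1, 6, 6], [1, 7, 7]], dtype=np.float32)
y_train = np.array([[0, 0, 1], [0, 0, 1], [0, 0, 1], [0, 1, 0],
                    [0, 1, 0], [0, 1, 0], [1, 0, 0], [1, 0, 0]], dtype=np.float32)
x_test = np.array([[2, 1, 1], [3, 1, 2], [3, 3, 4]], dtype=np.float32)
y_test = np.array([[0, 0, 1], [0, 0, 1], [0, 0, 1]], dtype=np.float32)

X = tf.placeholder(tf.float32, [None, 3])
Y = tf.placeholder(tf.float32, [None, 3])
W = tf.Variable(tf.random_normal([3, 3]))
b = tf.Variable(tf.random_normal([3]))

logits = tf.matmul(X, W) + b
# Softmax cross-entropy written out explicitly, matching the lecture's cost function
cost = tf.reduce_mean(-tf.reduce_sum(Y * tf.log(tf.nn.softmax(logits) + 1e-8), axis=1))
# A learning rate that is too large diverges; too small barely moves. 0.1 is a middle ground here.
train = tf.train.GradientDescentOptimizer(learning_rate=0.1).minimize(cost)

prediction = tf.argmax(logits, 1)
accuracy = tf.reduce_mean(tf.cast(tf.equal(prediction, tf.argmax(Y, 1)), tf.float32))

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(201):
        sess.run(train, feed_dict={X: x_train, Y: y_train})
    # Evaluate only on the held-out test set
    print("test accuracy:", sess.run(accuracy, feed_dict={X: x_test, Y: y_test}))
```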

Week 4 (2018-02-13)

  • Lectures
    • Neural Network 1: the XOR problem, how to train it, and Backpropagation (1986 breakthrough)
      • Solving the XOR problem with deep learning
      • Special: a 10-minute recap of derivatives
      • Training a deep network (backpropagation)
      • Lab 1: A TensorFlow deep network for XOR (see the sketch after this list)
      • Lab 2: Inspecting a deep network with TensorBoard
    • Neural Network 2: ReLU and weight initialization (2006/2007 breakthrough)
      • ReLU works better than sigmoid
      • Initializing weights well
      • Dropout and ensembles
      • Stacking network modules freely, like Lego blocks
      • Lab: Getting above 98% on MNIST with deep learning
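
A minimal sketch of solving XOR with one hidden layer, in TensorFlow 1.x style. The hidden-layer width (10), learning rate, and step count are illustrative assumptions; the point is only that a single linear layer cannot separate XOR while one hidden layer can.

```python
import numpy as np
import tensorflow as tf  # assumes TensorFlow 1.x

# The XOR truth table: not linearly separable, so a single layer is not enough
x_data = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=np.float32)
y_data = np.array([[0], [1], [1], [0]], dtype=np.float32)

X = tf.placeholder(tf.float32, [None, 2])
Y = tf.placeholder(tf.float32, [None, 1])

# Hidden layer lets the network bend the decision boundary
W1 = tf.Variable(tf.random_normal([2, 10]))
b1 = tf.Variable(tf.random_normal([10]))
layer1 = tf.sigmoid(tf.matmul(X, W1) + b1)

W2 = tf.Variable(tf.random_normal([10, 1]))
b2 = tf.Variable(tf.random_normal([1]))
hypothesis = tf.sigmoid(tf.matmul(layer1, W2) + b2)

cost = -tf.reduce_mean(Y * tf.log(hypothesis) + (1 - Y) * tf.log(1 - hypothesis))
train = tf.train.GradientDescentOptimizer(learning_rate=0.1).minimize(cost)

predicted = tf.cast(hypothesis > 0.5, tf.float32)
accuracy = tf.reduce_mean(tf.cast(tf.equal(predicted, Y), tf.float32))

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(10001):
        sess.run(train, feed_dict={X: x_data, Y: y_data})
    print("accuracy:", sess.run(accuracy, feed_dict={X: x_data, Y: y_data}))
```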

Week 5 (2018-02-19)

  • Lectures
    • Convolutional Neural Networks
      • Building the Conv layers of a ConvNet
      • ConvNet max pooling and the full network
      • Example applications of ConvNets
      • Lab 1: TensorFlow CNN basics (see the sketch after this list)
      • Lab 2: Implementing it in TensorFlow (MNIST 99%)
      • Lab 3: Class, tf.layers, Ensemble (MNIST 99.5%)
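
A minimal sketch of the conv-pool-conv-pool-FC structure discussed this week, for MNIST-shaped input, in TensorFlow 1.x style. Filter counts, sizes, and the optimizer settings are illustrative assumptions; it only builds the graph, which would then be trained in a Session with MNIST mini-batches.

```python
import tensorflow as tf  # assumes TensorFlow 1.x

# Placeholders for MNIST-like data: 28x28 grayscale images, 10 digit classes
X = tf.placeholder(tf.float32, [None, 784])
X_img = tf.reshape(X, [-1, 28, 28, 1])
Y = tf.placeholder(tf.float32, [None, 10])

# Conv layer 1: 3x3 filters, 32 maps, then 2x2 max pooling -> 14x14x32
W1 = tf.Variable(tf.random_normal([3, 3, 1, 32], stddev=0.01))
L1 = tf.nn.relu(tf.nn.conv2d(X_img, W1, strides=[1, 1, 1, 1], padding='SAME'))
L1 = tf.nn.max_pool(L1, ksize=[1, 2, 2, 1], strides=[1, 2, 2, 1], padding='SAME')

# Conv layer 2: 64 maps, then pooling -> 7x7x64
W2 = tf.Variable(tf.random_normal([3, 3, 32, 64], stddev=0.01))
L2 = tf.nn.relu(tf.nn.conv2d(L1, W2, strides=[1, 1, 1, 1], padding='SAME'))
L2 = tf.nn.max_pool(L2, ksize=[1, 2, 2, 1], strides=[1, 2, 2, 1], padding='SAME')
L2_flat = tf.reshape(L2, [-1, 7 * 7 * 64])

# Fully connected layer maps the flattened features to the 10 classes
W3 = tf.Variable(tf.random_normal([7 * 7 * 64, 10], stddev=0.01))
b3 = tf.Variable(tf.random_normal([10]))
logits = tf.matmul(L2_flat, W3) + b3

# Softmax cross-entropy cost and Adam optimizer; training would run in a Session
cost = tf.reduce_mean(-tf.reduce_sum(Y * tf.log(tf.nn.softmax(logits) + 1e-8), axis=1))
train = tf.train.AdamOptimizer(learning_rate=0.001).minimize(cost)
```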

Week 6 (2018-02-26)

  • Lectures
    • Recurrent Neural Networks
      • The story of RNNs, the flower of neural networks
      • Lab 1: RNN basics (see the sketch after this list)
      • Lab 2: Hi Hello RNN Training
      • Lab 3: Long Sequence RNN
      • Lab 4: Stacked RNN + Softmax Layer
      • Lab 5: Dynamic RNN
      • Lab 6: Time Series RNN
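
A minimal "hi hello"-style character RNN sketch in TensorFlow 1.x style: an LSTM cell learns to predict the next character of "hihell", so the full prediction reads "ihello". The encoding, cell size, and training settings are illustrative assumptions.

```python
import numpy as np
import tensorflow as tf  # assumes TensorFlow 1.x

# One-hot encode the input sequence "hihell"; target is the shifted sequence "ihello"
idx2char = ['h', 'i', 'e', 'l', 'o']
x_one_hot = np.array([[[1, 0, 0, 0, 0],   # h
                       [0, 1, 0, 0, 0],   # i
                       [1, 0, 0, 0, 0],   # h
                       [0, 0, 1, 0, 0],   # e
                       [0, 0, 0, 1, 0],   # l
                       [0, 0, 0, 1, 0]]], # l
                     dtype=np.float32)
y_data = np.array([[1, 0, 2, 3, 3, 4]])   # "ihello" as character indices

num_classes, hidden_size, sequence_length = 5, 5, 6

X = tf.placeholder(tf.float32, [None, sequence_length, num_classes])
Y = tf.placeholder(tf.int32, [None, sequence_length])

# LSTM cell unrolled over the sequence; hidden_size == num_classes so
# the raw outputs can serve directly as per-step logits
cell = tf.nn.rnn_cell.BasicLSTMCell(num_units=hidden_size)
outputs, _ = tf.nn.dynamic_rnn(cell, X, dtype=tf.float32)

loss = tf.reduce_mean(
    tf.nn.sparse_softmax_cross_entropy_with_logits(labels=Y, logits=outputs))
train = tf.train.AdamOptimizer(learning_rate=0.1).minimize(loss)
prediction = tf.argmax(outputs, axis=2)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(50):
        sess.run(train, feed_dict={X: x_one_hot, Y: y_data})
    result = sess.run(prediction, feed_dict={X: x_one_hot})
    print(''.join(idx2char[c] for c in np.squeeze(result)))
```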

Useful Resources
