AutuanLiu/PyTorch-ML

PyTorch-DNN

Implement DNN models and advanced policies with PyTorch.

Requirements

  1. torch >= 0.4.0
  2. torchvision >= 0.2.1

Content

  1. Cyclical Learning Rates

```python
from torch import optim
from torch.optim import lr_scheduler

# The initial lr must be 1 so that the multiplier returned by
# cyclical_lr is used as the learning rate directly.
optimizer = optim.Adam(model.parameters(), lr=1.)
clr = cyclical_lr(step_size, min_lr=0.001, max_lr=1, scale_func=clr_func, scale_md='iterations')
scheduler = lr_scheduler.LambdaLR(optimizer, [clr])
```
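The `cyclical_lr` factory above comes from this repository. A minimal sketch of what such a function might look like, implementing only the basic triangular policy from [1506.01186] (without the `scale_func`/`scale_md` options shown above), is:

```python
import math

def cyclical_lr(step_size, min_lr=0.001, max_lr=1.0):
    """Return a multiplier function for LambdaLR implementing the
    triangular CLR policy: the lr rises linearly from min_lr to
    max_lr over step_size iterations, then falls back symmetrically.
    This is an illustrative sketch, not the repository's exact code."""
    def lr_lambda(it):
        # which half-cycle we are in, and the position within it
        cycle = math.floor(1 + it / (2 * step_size))
        x = abs(it / step_size - 2 * cycle + 1)
        # x == 1 at the cycle boundaries (lr = min_lr),
        # x == 0 at the peak (lr = max_lr)
        return min_lr + (max_lr - min_lr) * max(0.0, 1 - x)
    return lr_lambda
```

Because the optimizer's base lr is 1, the value returned by `lr_lambda` is the effective learning rate at that iteration.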
  2. SGDR (has been merged into PyTorch)

```python
# T_max should be smaller than the number of training epochs if you
# want a restart policy. Note that CosineAnnealingLR has no T_mult
# argument; for warm restarts use CosineAnnealingWarmRestarts(optimizer,
# T_0, T_mult) instead (available in newer PyTorch versions).
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=100, eta_min=1e-8)
```
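Cosine annealing sets the learning rate at epoch t to eta_min + (eta_max − eta_min)(1 + cos(πt/T_max))/2, where eta_max is the optimizer's base lr. A plain-Python sketch of that per-epoch value (function name and arguments are illustrative):

```python
import math

def cosine_annealing_lr(epoch, eta_max, eta_min=1e-8, T_max=100):
    """Learning rate at `epoch` under cosine annealing: decays from
    eta_max (epoch 0) to eta_min (epoch T_max) along a half cosine.
    For SGDR-style warm restarts, reset `epoch` to 0 every T_max
    epochs (optionally growing T_max by a factor T_mult each cycle)."""
    return eta_min + 0.5 * (eta_max - eta_min) * (
        1 + math.cos(math.pi * epoch / T_max))
```

At epoch 0 this returns eta_max, at T_max/2 the midpoint, and at T_max it bottoms out at eta_min, matching the schedule PyTorch's scheduler applies step by step.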
  3. An abstract base class for building networks

```python
from torch import nn
from torch.optim import lr_scheduler

from models.BaseNet_class import BaseNet

# configuration for the trainer
configs = {
    'model': net,
    'opt': opt,
    'criterion': nn.CrossEntropyLoss(),
    'dataloaders': ...,
    'data_sz': ...,
    'lrs_decay': lr_scheduler.StepLR(opt, step_size=50),
    'prt_freq': 5,   # print progress every 5 epochs
    'epochs': 500,
}
sub_model = BaseNet(configs)

# train and test
sub_model.train_m()
sub_model.test_m()
```
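For intuition, a framework-agnostic skeleton of such a config-driven base class might look like the following. The key and method names (`train_m`, `test_m`, `prt_freq`, `lrs_decay`) follow the usage snippet above; the bodies are assumptions, not the repository's actual implementation:

```python
class BaseNet:
    """Sketch of a trainer driven by a configs dict (hypothetical)."""

    REQUIRED = ('model', 'opt', 'criterion', 'dataloaders', 'epochs')

    def __init__(self, configs):
        missing = [k for k in self.REQUIRED if k not in configs]
        if missing:
            raise ValueError(f'missing config keys: {missing}')
        for key, value in configs.items():
            setattr(self, key, value)
        # optional settings with defaults
        self.prt_freq = configs.get('prt_freq', 10)
        self.lrs_decay = configs.get('lrs_decay')

    def train_m(self):
        """Outer training loop; subclasses implement train_epoch."""
        for epoch in range(self.epochs):
            loss = self.train_epoch(epoch)
            if self.lrs_decay is not None:
                self.lrs_decay.step()  # apply the lr decay schedule
            if (epoch + 1) % self.prt_freq == 0:
                print(f'epoch {epoch + 1}/{self.epochs}  loss {loss:.4f}')

    def train_epoch(self, epoch):
        raise NotImplementedError

    def test_m(self):
        raise NotImplementedError
```

Subclasses then only supply the per-epoch forward/backward logic, while the base class owns the loop, the lr schedule, and progress printing.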

CNN

  • ResNet
  • AlexNet
  • GoogLeNet
  • DenseNet
  • VGGNet
  • LeNet
  • GAN
  • NiN
  • STN
  • VAE

RNN

Related papers

  1. [1608.03983] SGDR: Stochastic Gradient Descent with Warm Restarts
  2. [1506.01186] Cyclical Learning Rates for Training Neural Networks
  3. [1704.00109] Snapshot Ensembles: Train 1, get M for free

Related references

  1. Another data science student's blog
  2. Dive into Deep Learning (动手学深度学习) documentation
  3. Understanding LSTM and its diagrams
  4. Wu Liangchao's study notes (吴良超的学习笔记)
