Meganet.m

A fresh approach to deep learning written in MATLAB

Reporting Bugs

We are just getting started, so please be patient with us. If you find a bug, please report it by opening an issue or by emailing [email protected]. In either case, include a small example that helps us reproduce the error. We will address it as quickly as possible.

Getting started

  1. Clone or download the code.
  2. Add the folder to your MATLAB path.
  3. (optional) Run KernelTypes/mexcuda/make_cuda.m to build fast CNNs using CuDNN.
  4. (optional) Gather the test data or binary files described below.
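Assuming the repository has been cloned into the current directory under a folder named Meganet.m (the folder name is an assumption based on the repository name), step 2 can be sketched in MATLAB as:

```matlab
% Add the Meganet.m folder and all of its subfolders to the MATLAB path.
% Adjust the path below if you cloned the repository elsewhere.
addpath(genpath(fullfile(pwd, 'Meganet.m')));

% Optionally save the path so it persists across MATLAB sessions.
savepath;
```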

Optional Binary Files

The convMCN kernel type and average pooling require compiled binaries from the MatConvNet package. Please follow the MatConvNet installation instructions and add the files for vl_nnconv, vl_nnconvt, and vl_nnpool to your MATLAB path.

For best performance, these files can be compiled with GPU or CuDNN support.
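As a sanity check before running the examples, you can verify that the compiled MatConvNet binaries are visible to MATLAB (a minimal sketch; the function names are the MatConvNet ones listed above):

```matlab
% Check that the required MatConvNet mex files are on the MATLAB path.
required = {'vl_nnconv', 'vl_nnconvt', 'vl_nnpool'};
for k = 1:numel(required)
    if isempty(which(required{k}))
        warning('Cannot find %s; add the compiled MatConvNet binaries to your path.', required{k});
    end
end
```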

Additional Test Data

Some examples use the following benchmark datasets:

  1. MNIST
  2. CIFAR10
  3. STL-10

References

The implementation is based on the ideas presented in:

  1. Haber E, Ruthotto L: Stable Architectures for Deep Neural Networks. Inverse Problems, 2017.
  2. Chang B, Meng L, Haber E, Ruthotto L, Begert D, Holtham E: Reversible Architectures for Arbitrarily Deep Residual Neural Networks. AAAI Conference on Artificial Intelligence, 2018.
  3. Haber E, Ruthotto L, Holtham E, Jun SH: Learning Across Scales - A Multiscale Method for Convolution Neural Networks. AAAI Conference on Artificial Intelligence, 2018.
