-------------------------------------------------------------------------------
Matlab Environment for Deep Architecture Learning (MEDAL) - version 0.1
-------------------------------------------------------------------------------

   o   o
  / \ / \ EDAL
 o   o   o

Model Objects (see the inspection sketch after this list):
	mlnn.m        -- Multi-layer neural network
	mlcnn.m       -- Multi-layer convolutional neural network
	rbm.m         -- Restricted Boltzmann machine (RBM)
	mcrbm.m       -- Mean-covariance (3-way Factored) RBM
	drbm.m        -- Dynamic/conditional RBM 
	dbn.m         -- Deep Belief Network 
	crbm.m        -- Convolutional RBM
	ae.m          -- Shallow autoencoder 
	dae.m         -- Deep Autoencoder 
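
Assuming each of these files defines a class of the same name (as the "Model
Objects" label suggests), MATLAB's built-in introspection lets you poke at a
class without guessing its constructor arguments. The sketch below uses rbm
only as an example and assumes startLearning (see below) has already put the
medal directories on your path:

>> help rbm            % print rbm's help text
>> methods('rbm')      % list the methods the class defines
>> properties('rbm')   % list its public properties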
	
-------------------------------------------------------------------------------
To begin, type:

>> startLearning

in the medal directory.

To get an idea of how the model objects work, check out the demo script:

>> deepLearningExamples('all')
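
Put together, a first session looks something like this (a sketch; the path
below is hypothetical and stands in for wherever you cloned the repository):

>> cd /path/to/medal            % hypothetical location of your medal clone
>> startLearning                % initialize medal, as described above
>> deepLearningExamples('all')  % run the full set of demos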

These examples are by no means optimized; they are meant to help you get
familiar with the code. If you have any questions or bug reports, send them
my way:
 
[email protected]
-------------------------------------------------------------------------------
References:

*Neural Networks/Backpropagation:
 Rumelhart, D. et al. "Learning representations by back-propagating errors".
 Nature 323 (6088): 533–536. 1986.

*Restricted Boltzmann Machines/Contrastive Divergence:
 Hinton, G. E. "Training Products of Experts by Minimizing Contrastive
 Divergence". Neural Computation 14 (8): 1771–1800. 2002.

*Deep Belief Networks:
 Bengio, Y., Lamblin, P., Popovici, D., Larochelle, H. "Greedy Layer-Wise
 Training of Deep Networks." NIPS 2006.

*Deep & Denoising Autoencoders:
 Hinton, G. E. and Salakhutdinov, R. R. "Reducing the dimensionality of data
 with neural networks." Science, Vol. 313, no. 5786, pp. 504-507, 28 July 2006.

 Vincent, P. et al. "Stacked denoising autoencoders: Learning useful
 representations in a deep network with a local denoising criterion." The
 Journal of Machine Learning Research 11:3371-3408. 2010.

*Mean-Covariance/3-way Factored RBMs:
 Ranzato, M. et al. "Modeling Pixel Means and Covariances Using
 Factorized Third-Order Boltzmann Machines." CVPR 2010.

*Dynamic/Conditional RBMs:
 Taylor, G. et al. "Modeling Human Motion Using Binary Latent
 Variables." NIPS 2006.

*Convolutional MLNNs:
 LeCun, Y. et al. "Gradient-based learning applied to document recognition."
 Proceedings of the IEEE, 86(11), 2278–2324. 1998.

 Krizhevsky, A. et al. "ImageNet Classification with Deep Convolutional Neural
 Networks." NIPS 2012.

*Convolutional RBMs:
 Lee, H. et al. "Convolutional deep belief networks for scalable unsupervised
 learning of hierarchical representations." ICML 2009.

*Rectified Linear Units:
 Nair, V. and Hinton, G. E. "Rectified Linear Units Improve Restricted
 Boltzmann Machines." ICML 2010.

 Glorot, X., Bordes, A., and Bengio, Y. "Deep sparse rectifier neural
 networks." AISTATS 2011.

*Dropout Regularization:
 Hinton, G. E. et al. "Improving neural networks by preventing co-adaptation
 of feature detectors." Technical Report, Univ. of Toronto, 2012.
 
*General:
 Hinton, G. E. "A practical guide to training restricted Boltzmann machines."
 Technical Report, Univ. of Toronto, 2010.
-------------------------------------------------------------------------------
