I use Hessian-free optimization to train a deep neural network. The training is based on the MNIST data set.
Updated Nov 27, 2014 - MATLAB
Deep neural networks (DNNs) are a class of artificial neural networks (ANNs) that are deep in the sense that they have many layers of hidden units between the input and output layers. They are a form of deep learning, itself a branch of machine learning, and are used in a variety of applications, including speech recognition, computer vision, and natural language processing.
GPU-accelerated Deep Belief Network
Python Deep Neural Network ToolKit.
Implementation of Neural Networks in Theano for MNIST and AN4 dataset
Assignments for Special Topics in Signal Processing Course @ International Islamic University Islamabad, Spring 2016
Deep Neural Networks for Web Page Information Extraction
The project aimed to implement a deep NN / RNN based solution to develop flexible methods that can adaptively fill in, backfill, and predict time series using a large number of heterogeneous training datasets.
Fortran-Based Neural Networks
Action recognition using soft attention based deep recurrent neural networks
Implementation of Latent Dirichlet Allocation using a Deep Learning Model
This is a STORN (Stochastic Recurrent Network) implementation for Keras!
Deep neural networks implemented in TensorFlow & Python for predicting whether transcription factors will bind to given DNA sequences
LeNet example
[IC-MII-UGR-2016-17] Neural Network with single hidden layer learning MNIST with less than 1.12% test error.
Learning generative distribution of handwritten digits
This is a TensorFlow implementation of the End-to-End Memory Network applied to sequential modelling of Facebook comments.