lasagnemould

A hack for less verbose initialization of nolearn.lasagne neural nets.

To use this, import layers from lasagnemould instead of lasagne and initialize your layers like this:

from lasagnemould import layers
from lasagne.nonlinearities import softmax

mylayers = [
    layers.InputLayer(shape=(None, 784)),
    layers.DenseLayer(100),
    layers.DenseLayer(10, nonlinearity=softmax),
]
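
For context, a minimal sketch of how such a layer list would typically be handed to nolearn's NeuralNet. The training settings and the random toy data are assumptions for illustration, not part of lasagnemould:

import numpy as np
from nolearn.lasagne import NeuralNet

# Toy data just to make the sketch runnable: 784 input features, 10 classes.
X_train = np.random.rand(256, 784).astype('float32')
y_train = np.random.randint(0, 10, size=256).astype('int32')

net = NeuralNet(
    layers=mylayers,            # the layer list defined above
    update_learning_rate=0.01,  # assumed training settings
    update_momentum=0.9,
    max_epochs=10,
)
net.fit(X_train, y_train)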

The advantage is that you can instantiate the layers directly, positional arguments (*args) included, instead of going through layer factories, and without having to specify the incoming keyword argument yourself.

See the sketch below for how this initialization differs from the default nolearn.lasagne initialization.
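
For comparison, this is roughly what the conventional nolearn.lasagne style looks like, with layer factories and name-prefixed keyword arguments (a sketch of the standard API, not part of this repository):

from lasagne import layers
from lasagne.nonlinearities import softmax
from nolearn.lasagne import NeuralNet

net = NeuralNet(
    layers=[
        ('input', layers.InputLayer),
        ('hidden', layers.DenseLayer),
        ('output', layers.DenseLayer),
    ],
    # Layer parameters are passed as name-prefixed keyword arguments;
    # NeuralNet instantiates the factories and wires up incoming itself.
    input_shape=(None, 784),
    hidden_num_units=100,
    output_num_units=10,
    output_nonlinearity=softmax,
    update_learning_rate=0.01,
    update_momentum=0.9,
)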
