Jianbo Ye edited this page Jun 11, 2015 · 1 revision

Here is an example demonstrating the flexibility of neuron in prototyping a fractal neural network. A fractal neural network borrows from fractal geometry the idea that an object has recursively self-similar components; in other words, it carries a special recursive parameter-sharing structure. Implementing such a recursive parameter-sharing topology with neuron is straightforward:

```scala
class FractalNeuralNetwork(val depth: Int, val a: Operationable)
  extends Operationable {
  // b instantiates the template a with concrete (shared) parameters
  val b = a.create()
  // Each level doubles the input width of the level below
  val inputDimension = (scala.math.pow(2, depth) * a.inputDimension).toInt
  val outputDimension = a.outputDimension
  def create() = if (depth == 0) {
    a.create()
  } else {
    ((a + b) ** new FractalNeuralNetwork(depth - 1, b ++ a)).create()
  }
}
```

Note that in FractalNeuralNetwork, a is a template with only hyper-parameters specified, while b is a module that instantiates a with concrete parameters. FractalNeuralNetwork mixes them recursively so that the number of nodes grows exponentially with depth, while the number of parameters grows only linearly. This can let a neural network express highly nonlinear attribute interactions while keeping its parameter count at a reasonable size.
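To see why the construction stays parameter-efficient, here is a minimal, self-contained counting sketch. It does not use the neuron library; the two recurrences are assumptions read off the construction above (each level composes two self-similar copies of the level below, mirroring the 2^depth factor in inputDimension, but reuses the single shared module b, so only one new parameter module appears per level):

```scala
// Hypothetical sketch: tally node count vs. parameter-module count by depth.
object FractalGrowth {
  // Nodes at depth d: two self-similar copies of the depth d-1 network.
  def nodeCount(depth: Int): Long =
    if (depth == 0) 1L else 2L * nodeCount(depth - 1)

  // Parameter modules at depth d: one new shared module b per level.
  def paramModules(depth: Int): Long =
    if (depth == 0) 1L else 1L + paramModules(depth - 1)

  def main(args: Array[String]): Unit =
    for (d <- 0 to 6)
      // nodes = 2^d (exponential), params = d + 1 (linear)
      println(s"depth=$d nodes=${nodeCount(d)} paramModules=${paramModules(d)}")
}
```

At depth 6 the sketch reports 64 nodes but only 7 parameter modules, which is the exponential-versus-linear gap described above.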
