
Commit

Update local_minima.md
Abhishek-1Bhatt committed Jun 21, 2022
1 parent 14d9ee6 commit 968c688
Showing 1 changed file with 3 additions and 3 deletions.
6 changes: 3 additions & 3 deletions docs/src/training_tips/local_minima.md
@@ -104,7 +104,7 @@ stages. Strategy (3) seems to be more robust, so this is what will be demonstrated
 Let's start by reducing the timespan to `(0,1.5)`:

 ```@example iterativefit
-prob_neuralode = NeuralODE(dudt2, (0.0,1.5), Tsit5(), saveat = tsteps[tsteps .<= 1.5])
+prob_neuralode = NeuralODE(dudt2, (0.0f0,1.5f0), Tsit5(), saveat = tsteps[tsteps .<= 1.5])
 adtype = Optimization.AutoZygote()
 optf = Optimization.OptimizationFunction((x,p) -> loss_neuralode(x), adtype)
@@ -124,7 +124,7 @@ This fits beautifully. Now let's grow the timespan and utilize the parameters
 from our `(0,1.5)` fit as the initial condition to our next fit:

 ```@example iterativefit
-prob_neuralode = NeuralODE(dudt2, (0.0,3.0), Tsit5(), saveat = tsteps[tsteps .<= 3.0])
+prob_neuralode = NeuralODE(dudt2, (0.0f0,3.0f0), Tsit5(), saveat = tsteps[tsteps .<= 3.0])
 optprob = Optimization.OptimizationProblem(optf, result_neuralode.u)
 result_neuralode3 = Optimization.solve(optprob,
@@ -140,7 +140,7 @@ Once again a great fit. Now we utilize these parameters as the initial condition
 to the full fit:

 ```@example iterativefit
-prob_neuralode = NeuralODE(dudt2, (0.0,5.0), Tsit5(), saveat = tsteps)
+prob_neuralode = NeuralODE(dudt2, (0.0f0,5.0f0), Tsit5(), saveat = tsteps)
 optprob = Optimization.OptimizationProblem(optf, result_neuralode3.u)
 result_neuralode4 = Optimization.solve(optprob,
     ADAM(0.01), maxiters = 300,
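The diff above replaces `Float64` literals like `0.0` with `Float32` literals like `0.0f0` in each `NeuralODE` timespan. A minimal sketch of why the literal type matters in Julia (plain base-Julia illustration, not code from the changed file): neural-network parameters typically default to `Float32`, and mixing in a `Float64` timespan promotes arithmetic to `Float64`.

```julia
# `0.0` is a Float64 literal; `0.0f0` is a Float32 literal.
tspan64 = (0.0, 1.5)      # Tuple{Float64, Float64}
tspan32 = (0.0f0, 1.5f0)  # Tuple{Float32, Float32}

println(typeof(tspan64))  # the Float64 tuple
println(typeof(tspan32))  # the Float32 tuple

# Mixing Float32 parameters with a Float64 timespan promotes to Float64:
println(promote_type(Float32, Float64))
```

Keeping every literal in `Float32` keeps the whole solve in one precision, which avoids silent promotion of `Float32` model parameters.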
