Distributed Array of solutions #13
It's not immediately clear to me how this can be done (maybe JuliaParallel/DistributedArrays.jl#145 is relevant). I'll follow up here when I get a response.
What you're looking for with calling the simulation is:

```julia
addprocs()
@everywhere begin
    import Distributions: Uniform
    using DiffEqBase, OrdinaryDiffEq, DiffEqMonteCarlo
    using DistributedArrays

    # number of Monte Carlo iterations per process
    n_iter = 10

    # time span and output step
    tspan = (0., 100.)
    step = 1.

    # ODE right-hand side
    pf_func = function (t, u, p, du)
        vel = p[3]*p[2]*u[2]/(p[4]+u[2])
        vabs = p[1]*p[5]*u[1]
        du[1] = -vabs/p[1]
        du[2] = (vabs-vel)/p[3]
    end

    # defaults: u0 = [C0, C1]; params = [Default, kcat, Vd, Km, kabs]
    u0 = [10., 0.]
    params = [1., 7.282813e-1, 5.275000e0, 5.257432e0, 2.090394e0]
    pf = ParameterizedFunction(pf_func, params)
    prob = ODEProblem(pf, u0, tspan)

    # Monte Carlo: resample the parameters for each trajectory
    prob_func1 = function (prob, i)
        prob.f.params[2] = rand(Uniform(6.5, 7.5))
        prob.f.params[3] = rand(Uniform(4.5, 5.5))
        prob.f.params[4] = rand(Uniform(4.5, 5.5))
        prob.f.params[5] = rand(Uniform(1.5, 3.))
        prob
    end
    monte_prob = MonteCarloProblem(prob, prob_func=prob_func1)

    srand(myid())
    sim = solve(monte_prob, Tsit5(), saveat=collect(tspan[1]:step:tspan[2]),
                parallel_type=:threads, num_monte=n_iter)
end
```

The aggregation seems to stall:

```julia
dfill(sim, ((n_iter for i in 1:nprocs())...))
```

so I'm not entirely sure that way of aggregating is correct.
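If `dfill` stalls, one alternative pattern (a sketch on current Julia's Distributed stdlib, not the code above: `simulate_chunk` is a hypothetical stand-in for the per-worker `solve(monte_prob, ...)` call) is to fetch each worker's result explicitly and concatenate on the master:

```julia
using Distributed
addprocs(2)

# Run a chunk of work on each worker and fetch the results back explicitly,
# instead of building a DistributedArray with dfill. `simulate_chunk` is a
# placeholder for the real Monte Carlo solve on that worker.
@everywhere simulate_chunk(n) = [myid() * 1.0 for _ in 1:n]

per_worker = [fetch(@spawnat p simulate_chunk(3)) for p in workers()]
all_results = reduce(vcat, per_worker)  # one flat array of all solutions
```

This trades the lazy distributed container for an eager gather, which is simpler to debug when the distributed path hangs.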
Chris, thanks a lot! I aggregated the results using ParallelDataTransfer; I guess it is not ideal though.
Sorry, this way is better :)

```julia
using ParallelDataTransfer
m_val.u /= nprocs()
```
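The `m_val.u /= nprocs()` step averages a sum accumulated across processes. A minimal stand-alone sketch of that pattern (with `worker_value` as a hypothetical per-worker result, not part of the code above):

```julia
using Distributed
addprocs(2)

# Each worker contributes one value; the master sums them and divides by
# the number of workers, mirroring the m_val.u /= nprocs() normalization.
@everywhere worker_value() = 10.0

total = sum(fetch(@spawnat p worker_value()) for p in workers())
avg = total / nworkers()  # divide the accumulated sum by the worker count
```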
If you just need the statistical values, use the reduction from https://github.com/JuliaDiffEq/DiffEqMonteCarlo.jl/blob/master/test/monte.jl#L79. That makes it save only the sum of the endpoints instead of ever saving the other values, so you never even store the full array of solutions. Of course, only limited applications can benefit from this, so the general solution is still necessary.
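The reduction idea can be sketched in plain Julia (illustrative names, not the DiffEqMonteCarlo API): fold each batch of endpoints into a running sum so the full array of solutions is never kept in memory.

```julia
# fold_endpoints is a hypothetical illustration of an online reduction:
# only a running sum and a count survive each batch, never the batches.
function fold_endpoints(batches)
    total, n = 0.0, 0
    for batch in batches
        total += sum(batch)   # keep only the running sum of endpoints
        n += length(batch)
    end
    return total / n          # mean endpoint over all trajectories
end
```

The memory cost is O(1) in the number of trajectories, which is why this only helps when a summary statistic is all you need.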
Chris, thanks! For this case I think the reduction will be sufficient; however, in the future the full solution may also be needed.
I got something on master. Essentially, you can use:

```julia
sim = solve(prob2, SRIW1(), Val{false}, dt=1//2^(3), num_monte=10)
@test length(sim) == 10
sim = solve(prob2, SRIW1(), Val{false}, dt=1//2^(3), num_monte=10, parallel_type=:threads)
@test length(sim) == 10
```

That interface is a little awkward, so I'll want to test it out a little bit more, but it works.
The release requires a DistributedArrays.jl tag: JuliaParallel/DistributedArrays.jl#147. But if you have master for DistributedArrays and DiffEqMonteCarlo, the code above works.
Chris, thanks a lot!
Implemented.
It would be useful to have an extension that does not transfer the data from the remote workers to the master, but instead keeps it stored on the workers after the simulations.