fire package_loading callback at every require #28860

Closed

Conversation

@vchuravy (Member)

Before this change we only fired package callbacks when a top-level module was freshly loaded, or in

julia/base/loading.jl, lines 644 to 646 (at d55b044):

for callback in package_callbacks
    invokelatest(callback, modkey)
end

With this change we fire the callbacks on every require, so that Distributed has a chance to propagate the user's intent.
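For context, Base.package_callbacks is the hook in question: each registered callback is invoked (via invokelatest) with the key of the module that require just handled, and packages such as Requires.jl and Revise.jl register themselves through it. A minimal sketch of registering such a callback (the handler name and log message are illustrative, not part of this PR):

using Base: PkgId, package_callbacks

# Invoked with the PkgId of the module that `require` just handled.
function on_package_loaded(pkgid::PkgId)
    @info "package callback fired" pkgid
end

push!(package_callbacks, on_package_loaded)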

The other solution that comes to mind is to do something like

remotecall(map, id -> Base.require(id), keys(Base.loaded_modules))

as part of the worker setup, so that a remotecall to a worker is guaranteed to succeed.
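A runnable sketch of that alternative, assuming it runs on the master for each newly added worker (sync_loaded_modules is a hypothetical helper; Base.loaded_modules and Base.require(::PkgId) are the internals being leaned on):

using Distributed

# On worker `w`, re-require every module already loaded on the master,
# so that later remotecalls referencing those modules can resolve them.
function sync_loaded_modules(w::Integer)
    for id in keys(Base.loaded_modules)
        remotecall_wait(Base.require, w, id)
    end
end

addprocs(2)
foreach(sync_loaded_modules, workers())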

fixes JuliaLang/Distributed.jl#56
cc: other users of this callback @MikeInnes for Requires.jl and @timholy for Revise.jl (maybe?)

@ararslan (Member)

Error During Test at none:1
  Test threw exception
  Expression: Distributed
  Some tests did not pass: 0 passed, 0 failed, 1 errored, 0 broken.

That's... cryptic.

@ararslan (Member)

Never mind, scrolling helps.

ERROR: LoadError: On worker 2:
UndefVarError: svd not defined
#684 at ./asyncmap.jl:178
foreach at ./abstractarray.jl:1835
maptwice at ./asyncmap.jl:178
wrap_n_exec_twice at ./asyncmap.jl:154 [inlined]
#async_usemap#669 at ./asyncmap.jl:103
#async_usemap at ./none:0 [inlined]
#asyncmap#668 at ./asyncmap.jl:81 [inlined]
asyncmap at ./asyncmap.jl:81 [inlined]
#208 at /usr/home/julia/julia-fbsd-buildbot/worker/11rel-amd64/build/usr/share/julia/stdlib/v1.1/Distributed/src/pmap.jl:26
#112 at /usr/home/julia/julia-fbsd-buildbot/worker/11rel-amd64/build/usr/share/julia/stdlib/v1.1/Distributed/src/process_messages.jl:269
run_work_thunk at /usr/home/julia/julia-fbsd-buildbot/worker/11rel-amd64/build/usr/share/julia/stdlib/v1.1/Distributed/src/process_messages.jl:56
macro expansion at /usr/home/julia/julia-fbsd-buildbot/worker/11rel-amd64/build/usr/share/julia/stdlib/v1.1/Distributed/src/process_messages.jl:269 [inlined]
#111 at ./task.jl:259
Stacktrace:
 [1] #707 at ./asyncmap.jl:331 [inlined]
 [2] foreach at ./abstractarray.jl:1835 [inlined]
 [3] wait_done(::Base.AsyncCollector, ::Base.AsyncCollectorState) at ./asyncmap.jl:331
 [4] iterate(::Base.AsyncCollector, ::Base.AsyncCollectorState) at ./asyncmap.jl:345
 [5] iterate(::Base.AsyncGenerator, ::Base.AsyncGeneratorState) at ./asyncmap.jl:387
 [6] iterate at ./asyncmap.jl:383 [inlined]
 [7] iterate at ./iterators.jl:903 [inlined]
 [8] iterate at ./iterators.jl:899 [inlined]
 [9] grow_to!(::Array{Any,1}, ::Base.Iterators.Flatten{Base.AsyncGenerator}) at ./array.jl:674
 [10] _collect at ./array.jl:593 [inlined]
 [11] collect(::Base.Iterators.Flatten{Base.AsyncGenerator}) at ./array.jl:557
 [12] top-level scope at none:0
 [13] include at ./boot.jl:317 [inlined]
 [14] include_relative(::Module, ::String) at ./loading.jl:1038
 [15] include(::Module, ::String) at ./sysimg.jl:29
 [16] exec_options(::Base.JLOptions) at ./client.jl:229
 [17] _start() at ./client.jl:421
in expression starting at /usr/home/julia/julia-fbsd-buildbot/worker/11rel-amd64/build/usr/share/julia/stdlib/v1.1/Distributed/test/distributed_exec.jl:610

@vchuravy (Member, Author)

Oh interesting, I should have waited for the tests to finish locally before pushing and going to dinner.

# Make sure that call of imported binding (on master)
# works everywhere.
for w in procs()
    remotecall_wait(seed, 1)
Member:
seed isn't a thing. Did you mean Random.seed!?

Member (Author):
Yes :/
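Presumably the test hunk above, once fixed per the comment, would read roughly like the following (a hedged reconstruction; the seed value of 1 and the use of w are assumptions based on the surrounding loop):

using Distributed, Random

# Call the binding imported on the master (Random.seed!) on every process,
# to check that imported bindings work everywhere.
for w in procs()
    remotecall_wait(Random.seed!, w, 1)
end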

@KristofferC mentioned this pull request on Aug 24, 2018
@ararslan (Member)

Bump

@timholy (Member) commented on Nov 25, 2018

Will likely be breaking for Revise, but I can work with that.

More relevant is JuliaLang/Distributed.jl#56.

@andreasnoack (Member) left a comment:

I don't feel confident in this part of the code base, so I've requested a review from @vtjnash.

@@ -0,0 +1,6 @@
authors = ["Valentin Churavy <[email protected]>"]
Member (Author):
These should probably go :)

@StefanKarpinski added the triage and backport 1.0 labels and removed the triage label on Jan 31, 2019
@JeffBezanson (Member)

It doesn't seem right to call these callbacks every time the package is requested. Is this only an issue involving Distributed, or is there a more general problem as well?

@JeffBezanson removed the backport 1.0 and triage labels on Jan 31, 2019
@vtjnash (Member) left a comment:

I agree with Jeff. It does seem like this callback is very much in the wrong place though, since it only triggers on calls to exactly require(:Foo).

@vchuravy closed this on Jan 7, 2023
@vchuravy deleted the vc/distloading branch on Jan 7, 2023
Labels: parallelism (Parallel or distributed computation)

Successfully merging this pull request may close: Order dependent module loading with Distributed (JuliaLang/Distributed.jl#56)