
Turn on inference for OpaqueClosure #39681

Merged
Keno merged 1 commit into master from kf/ocinference on Mar 5, 2021
Conversation

@Keno (Member) commented on Feb 16, 2021

This turns on inference for PartialOpaque callees (but no
optimization/inlining yet and also no dynamic dispatch
to the optimized implementations). Because of the current design
and some fixes getting pulled into previous PRs, I believe this
is all that remains to be done on the inference front.

In particular, we specialize the OpaqueClosure methods on
the tuple formed by the tuple type of the environment
(at inference time) and the argument tuples. This is a bit of
an odd method specialization, but it seems like inference
is just fine with it in general. In the fullness of time,
we may want to store the specializations differently
to give more freedom to partial optimizations, but that
would require being able to re-enter inference later, which
is currently not possible.
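
As a rough illustration of what inference of a `PartialOpaque` callee buys, here is a minimal sketch (the `Base.Experimental.@opaque` surface syntax and the `Base.return_types` check are assumptions for illustration, not part of this PR):

```julia
# Sketch: an opaque closure constructed and called inside the same function is
# tracked as a PartialOpaque during inference, so the call through it can now
# be inferred instead of falling back to Any.
function g(a)
    oc = Base.Experimental.@opaque x -> x + a
    return oc(2.0)
end

Base.return_types(g, (Float64,))   # expected: [Float64] rather than [Any]
```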

@Keno changed the title from "WIP: Turn on inference for OpaqueClosure" to "Turn on inference for OpaqueClosure" on Mar 4, 2021
@Keno (Member, Author) commented on Mar 4, 2021

Now infers `sin''(x)` fine, which had been my smoke test for Diffractor, so good to go from my side.
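
For context, the smoke test is roughly of this shape (a sketch: the postfix `'` import from Diffractor is assumed from its README, and "infers fine" is illustrated with a `Base.return_types` check):

```julia
using Diffractor: var"'"   # assumed import of the postfix derivative operator

sin'(1.0)    # ≈ cos(1.0)
sin''(1.0)   # ≈ -sin(1.0)

# "Infers fine" here means the second derivative's return type is concrete:
Base.return_types(x -> sin''(x), (Float64,))   # expected: [Float64] rather than [Any]
```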

@Keno merged commit e49567f into master on Mar 5, 2021
@Keno deleted the kf/ocinference branch on March 5, 2021 at 00:50
@oscardssmith (Member) commented
Is there a level of `'` where it breaks down? I.e., will `sin'''''''''''(x)` have good performance?

@Keno (Member, Author) commented on Mar 5, 2021

Right now it breaks at `sin'''`. Diffractor is split into two phases. The first one is just generated functions + opaque closures; that's what I was testing above. I think it'll be OK at 3rd or 4th order, but it's suboptimal. I have a prototype for what I'm calling Diffractor Phase 2, which uses a custom lattice element to represent the entire optic at once. That one I've tested successfully at 10th order, but there the question is how to integrate it properly. We'll need some sort of high-level interface that integrates nicely with the rest of the system, but that isn't designed yet.

# Prepend the captured environment type to the call's argument types and build
# the combined signature the opaque-closure method is specialized on.
pushfirst!(argtypes, closure.env)
sig = argtypes_to_type(argtypes)
# Infer the closure's source Method against that signature.
rt, edgecycle, edge = abstract_call_method(interp, closure.source::Method, sig, Core.svec(), false, sv)
info = OpaqueClosureCallInfo(edge)
A reviewer (Sponsor Member) commented on the hunk above:

JET now warns me that `edge` could be `nothing`, and thus this call may result in an error.
I guess we may want to bail out with `return CallMeta(rt, false)` in that case?
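
A minimal sketch of the suggested bail-out, spliced into the hunk above (the surrounding names and the `CallMeta`/`OpaqueClosureCallInfo` signatures are taken from the quoted code and not checked against the follow-up PR):

```julia
rt, edgecycle, edge = abstract_call_method(interp, closure.source::Method, sig, Core.svec(), false, sv)
# abstract_call_method may return edge === nothing; bail out before building the call info
edge === nothing && return CallMeta(rt, false)
info = OpaqueClosureCallInfo(edge)
```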

@Keno (Member, Author) replied:

Yeah, I'm revising this code in a follow-up PR.

ElOceanografo pushed a commit to ElOceanografo/julia that referenced this pull request May 4, 2021
antoine-levitt pushed a commit to antoine-levitt/julia that referenced this pull request May 9, 2021