RFC: Allow inference of recursion on a decreasing integer type parameter #26172
This makes use of the alternative already mentioned by @vtjnash in the comment.
Motivation
In JuliaArrays/StaticArrays.jl#368, I have implemented an LU decomposition for StaticMatrices that recursively calls itself on sub-matrices. It works (almost) without `@generated` functions and allows for relatively clean code. Overall, I'm quite happy with the result---except that it doesn't work on 0.7. Well, it works *), but it cannot be inferred, which is bad news for StaticArrays. Slightly simplified, the problem is that `lu(::SArray{Tuple{6,6},Float64,2,36})` calls `lu(::SArray{Tuple{5,5},Float64,2,25})`, which calls `lu(::SArray{Tuple{4,4},Float64,2,16})`, and so on; the latter argument types are considered more complex, making inference bail out of the recursion. However, that coding style is probably recommendable, similar to working with tuples by recursively decomposing them. So it would be really nice to make inference work for it, which this PR achieves.

*) After applying a work-around for #26083.
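To make the pattern concrete, here is a minimal sketch (not the actual StaticArrays code; `SquareMat` and `rec_trace` are hypothetical names) of a recursion whose argument type shrinks on an integer type parameter at every call, just like the `lu` chain above:

```julia
# Hypothetical N×N container with the size as an integer type parameter.
struct SquareMat{N,T}
    data::NTuple{N,NTuple{N,T}}
end

# Each call recurses on an (N-1)×(N-1) value, so the argument types form a
# strictly decreasing chain SquareMat{6} -> SquareMat{5} -> ... -> SquareMat{0}.
# This is the shape of recursion this PR teaches inference to follow.
function rec_trace(m::SquareMat{N,T}) where {N,T}
    N == 0 && return zero(T)
    # Top-left (N-1)×(N-1) sub-matrix.
    sub = SquareMat{N-1,T}(ntuple(i -> ntuple(j -> m.data[i][j], N-1), N-1))
    return m.data[N][N] + rec_trace(sub)
end
```

Before this PR, inference sees each `SquareMat{N-1,T}` as more complex than its caller's `SquareMat{N,T}` and widens; with the decreasing integer parameter recognized, the chain is finite and inferable.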
Drawback
If one writes a recursion where an integer type parameter is successively decremented to zero and starts it at 10000, inference will run into a stack overflow. This could be solved by putting an upper limit on `t` (or `c`) here, but it's a bit unclear what that limit should be.
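The pathological case can be illustrated with a hypothetical countdown (the name `countdown` is illustrative, not from the PR):

```julia
# Each call has a distinct argument type Val{N}, Val{N-1}, ..., Val{0},
# so inference following the chain must visit every one of them.
countdown(::Val{0}) = 0
countdown(::Val{N}) where {N} = countdown(Val(N - 1))

# countdown(Val(10000))  # 10000 distinct types for inference to walk
```

For small starting values the chain is short and harmless; the concern is only that nothing bounds how deep inference descends when the starting parameter is huge.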