
codegen: start to remove the ability to call back into inference #54655

Merged: 3 commits into master on Jun 6, 2024

Conversation

@vtjnash commented on Jun 3, 2024

Continuing the development in #53219, following the plan in https://hackmd.io/@vtjnash/codeinstances, this further separates the meanings of CodeInfo and CodeInstance: a CodeInstance can only be used as a call target and cannot be used for code generation, while a CodeInfo can only be used to generate code (or for reflection on what code would be generated) and cannot be used as a call target. The eventual idea is that a CodeInfo only shows up as an ephemeral input (e.g. to inlining or codegen), while a CodeInstance is what shows up in a cache (e.g. as a callable object).
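
For orientation, here is a minimal REPL sketch of the two objects as they stand today. The reflection helpers used are Julia internals, so exact names and fields may differ across versions:

```julia
f(x) = x + 1
f(1)   # run once so a specialization is inferred and cached

# CodeInfo: ephemeral IR, the *input* to inlining/codegen
ci, rt = only(code_typed(f, (Int,)))
typeof(ci)        # Core.CodeInfo

# CodeInstance: the cached, callable artifact produced from that IR.
# `method_instances` and the `cache` field are internals and may differ
# across Julia versions.
mi = Base.method_instances(f, (Int,))[1]
typeof(mi.cache)  # Core.CodeInstance
```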

@vtjnash added the compiler:codegen label (Generation of LLVM IR and native code) on Jun 3, 2024
@vtjnash force-pushed the jn/codegen-codeinfo-only branch 3 times, most recently from 3622407 to 4d4bc60, on June 5, 2024
This is preparation for making codegen purely a transform of CodeInfo ->
Module* without involving any of the caches (at the C++ level), so that
this can instead be driven by caches managed entirely at the Julia
inference level.

Removes the test for code_llvm lookup behavior: since codegen no longer
does lookup, that behavior is no longer relevant to test.
…d of jl_type_infer

Removes last remaining use of SOURCE_MODE_FORCE_SOURCE_UNCACHED, allowing it to be eliminated.
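
To make the intended division of labor concrete, here is a hypothetical pseudocode sketch of the cache discipline the first commit message describes; none of these helper names exist, they only illustrate the direction:

```julia
# Hypothetical sketch: inference-level caches drive compilation, and
# codegen is reduced to a pure CodeInfo -> LLVM module transform.
function compile!(mi::Core.MethodInstance, world::UInt)
    ci = cache_lookup(mi, world)        # hypothetical Julia-level cache query
    if ci === nothing
        src, ci = infer(mi, world)      # hypothetical: ephemeral CodeInfo + new CodeInstance
        cache_store!(mi, ci)            # caching decided here, not inside codegen
        emit_llvm!(ci, src)             # hypothetical pure transform; never calls
                                        # back into inference or the caches
    end
    return ci                           # the callable artifact
end
```
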
@vtjnash added the status:merge me label (PR is reviewed; merge when all tests are passing) on Jun 5, 2024
```c
// Build a Julia call to Core.Compiler.typeinf_code(interp, mi, true) from C.
// fargs[1] (set in a part of the hunk not shown here) holds the manually
// constructed NativeInterpreter discussed in the review comment below;
// jl_true is the run_optimizer flag.
fargs[0] = jl_get_global(CC, jl_symbol("typeinf_code"));
fargs[2] = (jl_value_t*)mi;
fargs[3] = jl_true;
ci = (jl_code_info_t*)jl_apply(fargs, 4);
```
Review comment on the snippet above:

Why is this not calling jl_typeinf_func? Or typeinf_ext_toplevel instead of constructing the NativeInterpreter manually?

@vtjnash (author) replied:

Those do not return source code, while the point of this function is to implement a basic version of code_typed, for debugging in gdb when things look broken.
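
For reference, a rough Julia-level equivalent of what this C helper computes; these are Core.Compiler internals, so the names and return conventions may shift between versions:

```julia
# Approximately what the debugging helper does, expressed in Julia:
# construct a NativeInterpreter by hand and ask for inferred source
# (a CodeInfo), rather than a cached CodeInstance.
world  = Base.get_world_counter()
interp = Core.Compiler.NativeInterpreter(world)
mi     = Base.method_instances(sin, (Float64,))[1]      # example MethodInstance
src    = Core.Compiler.typeinf_code(interp, mi, true)   # CodeInfo, optimizer run
```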

@vchuravy left a comment:

This should be fine for GPUCompiler. If you are planning to remove lookup from jl_ci_cache_lookup, and thus jl_create_native, we should talk about a strategy for GPUCompiler.

Currently GPUCompiler does: https://github.com/JuliaGPU/GPUCompiler.jl/blob/8b513be9e2230fe0dd1905b805e25fa049b24d1d/src/jlgen.jl#L603-L617

I ran the GPUCompiler tests locally and they all passed.

cc: @maleadt

@vtjnash commented on Jun 6, 2024:

> we should talk about a strategy for GPUCompiler.

The plan is to require a CodeInstance in that IR position instead of a MethodInstance, so that the lookup is no longer necessary.
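
Loosely, the planned shape might look like the following; every name below is hypothetical and only sketches the direction described above, not an existing API:

```julia
# Today (sketch): codegen receives a MethodInstance and must resolve
# (or trigger) inference itself:
#     native = create_native([mi], params)
#
# Planned (sketch): the caller resolves the CodeInstance first, so
# codegen never calls back into inference.
ci = cache_lookup(mi, world)              # hypothetical caller-side lookup
ci === nothing && (ci = infer(mi, world)) # hypothetical inference entry point
native = create_native([ci], params)      # consumes CodeInstances directly
```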

@vtjnash merged commit ec32170 into master on Jun 6, 2024 (6 of 8 checks passed)
@vtjnash deleted the jn/codegen-codeinfo-only branch on June 6, 2024 at 11:45
@oscardssmith removed the status:merge me label on Jun 6, 2024