KernelAbstractions.jl-related issues #1838
Labels
enhancement
New feature or request
Now that a KA.jl back-end is part of CUDA.jl and is being tested on CI, I encountered a couple of issues:

- I'm not sure why we're already using Atomix.jl here, ref. #1790?

cc @vchuravy

Comments

KernelAbstractions uses Atomix.jl because it is otherwise impossible to use atomic operations across back-ends. I will update UnsafeAtomicsLLVM for LLVM 5.0; #1790 is about reducing the dependency there so that the atomics are actually implemented in CUDA.jl (I think I can do it in two steps). Regarding the IO output, I don't remember why we didn't capture that; it should just be a quick PR to KA's test suite.

I think these have been fixed.
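For context, the cross-back-end atomics that Atomix.jl provides can be sketched as follows. This is a minimal, hypothetical example (the `histogram!` kernel and the bin data are invented for illustration); it assumes KernelAbstractions.jl v0.9 and Atomix.jl are installed. The same kernel is expected to run unchanged on `CUDABackend()` from CUDA.jl, which is the point of routing atomics through Atomix rather than a back-end-specific API:

```julia
# Sketch: a KernelAbstractions kernel using Atomix.@atomic so the
# atomic increment works on any back-end (CPU here, GPU via CUDA.jl).
using KernelAbstractions
using Atomix: @atomic

@kernel function histogram!(counts, data)
    i = @index(Global, Linear)
    bin = data[i]
    # Atomix lowers this to the appropriate atomic op per back-end
    @atomic counts[bin] += 1
end

backend = CPU()  # swap for CUDABackend() on an NVIDIA GPU
data = rand(1:4, 1_000)
counts = zeros(Int, 4)
histogram!(backend)(counts, data; ndrange = length(data))
KernelAbstractions.synchronize(backend)
```

Because the increment is atomic, `sum(counts) == length(data)` holds even when work-items race on the same bin.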