
fix ROCm on Windows #683

Merged 5 commits into ggerganov:master on Jan 6, 2024
Conversation

@Cyberhan123 (Contributor) commented Jan 5, 2024

There is no need to link the m library in the Windows environment; if we try to link it, an error occurs. I can confirm this is correct because it is what I do in rwkv, and we can also verify against the rwkv releases that the built library is correct.

@slaren (Collaborator) commented Jan 5, 2024

Is there any reason to not add the library to the ggml-rocm target instead?

@Cyberhan123 (Contributor, author) commented Jan 5, 2024

> Is there any reason to not add the library to the ggml-rocm target instead?

On Windows, we need the following commands to complete the build:

mkdir build
cd build
cmake .. -G "Ninja" -DCMAKE_C_COMPILER=clang -DCMAKE_CXX_COMPILER=clang++ -DRWKV_HIPBLAS=ON -DCMAKE_BUILD_TYPE=Release -DAMDGPU_TARGETS=gfx1100
cmake --build . --config Release

We will get the following error report:

lld-link: error: could not open 'm.lib': no such file or directory
CLANG_~1: error: linker command failed with exit code 1 (use -v to see invocation)
ninja: build stopped: subcommand failed.

My conclusion from analyzing this is that ROCm's clang/clang++ toolchain does not ship an m.lib.

@Cyberhan123 (Contributor, author) commented Jan 5, 2024

@slaren I also searched on Google and found a similar issue: conda-forge/fftw-feedstock#75.

@slaren (Collaborator) commented Jan 5, 2024

I have been checking how this is done in llama.cpp, but from what I can tell llama.cpp doesn't link to m at all, or at least not explicitly. So maybe this isn't necessary? @ggerganov do you know why ggml links to m?

Otherwise, if I understand the issue, under Windows there is no need to link to the m library, and that applies both to MSVC and clang. So maybe checking for WIN32 instead of MSVC would also work:

if (WIN32)
    target_link_libraries(${TARGET} PUBLIC ${GGML_EXTRA_LIBS} ${CMAKE_THREAD_LIBS_INIT})
else()
    target_link_libraries(${TARGET} PUBLIC m ${GGML_EXTRA_LIBS} ${CMAKE_THREAD_LIBS_INIT})
endif()

@ggerganov (Owner) commented:

I have forgotten why this if exists. Let's change it to never link m and merge it. If something breaks, we will bring it back.

@Cyberhan123 (Contributor, author) commented:

I also added some settings, which are the same as llama.cpp.

src/CMakeLists.txt review thread (outdated, resolved)
@Cyberhan123 (Contributor, author) commented:

OK.

@Cyberhan123 (Contributor, author) commented:

It looks like CI failed.

@slaren (Collaborator) commented Jan 6, 2024

I have changed it to find_library as suggested in https://cliutils.gitlab.io/modern-cmake/chapters/features/small.html.
Can you test if this still works with HIP?
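The find_library pattern from that page looks roughly like the sketch below (an illustration of the approach, not necessarily the exact change in this PR; the variable name MATH_LIBRARY is illustrative):

```cmake
# Sketch of the find_library pattern. find_library locates libm where it
# exists; on Windows toolchains that have no m.lib, the variable is left
# unset and nothing extra is linked, so no per-platform if(WIN32) is needed.
find_library(MATH_LIBRARY m)
if (MATH_LIBRARY)
    target_link_libraries(${TARGET} PUBLIC ${MATH_LIBRARY})
endif()
```

This replaces the explicit WIN32/MSVC branching with a probe of the toolchain itself, which also covers cross-compilation cases.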

@slaren (Collaborator) commented Jan 6, 2024

I suspect the reason it is not necessary to link with it explicitly in llama.cpp is that there are always C++ sources in the target.

@Cyberhan123 (Contributor, author) commented:

It is working.

@slaren slaren merged commit 8bf3f00 into ggerganov:master Jan 6, 2024
4 checks passed
@Cyberhan123 Cyberhan123 deleted the fix_hip_windows branch January 15, 2024 02:47
3 participants