cmake : fix VULKAN and ROCm builds #5525
Conversation
ggml-ci
The CMake changes look good to me (and are mostly refactors), though I am no CMake expert. I tested this through the nix flake (and you can see that the flake builds successfully in CI below):
$ nix build github:ggerganov/llama.cpp/gg/fix-cmake#{default,rocm,vulkan,cuda,mpi-cpu,mpi-cuda,opencl}
$ echo $?
0
@ggerganov you can see that rocm built successfully in https://github.com/ggerganov/llama.cpp/actions/runs/7929857796/job/21650935395?pr=5525
Thanks for the fix.
* cmake : fix VULKAN and ROCm builds
* cmake : fix (cont)
* vulkan : fix compile warnings ggml-ci
* cmake : fix ggml-ci
* cmake : minor ggml-ci
Removes the ggml-vulkan and ggml-rocm targets - link everything in ggml

TODO:
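The target consolidation described above could be sketched roughly as follows; the option names, source lists, and variable names here (`GGML_SOURCES_EXTRA`, `GGML_EXTRA_LIBS`) are illustrative assumptions, not the PR's actual diff:

```cmake
# Sketch: instead of separate ggml-vulkan / ggml-rocm library targets,
# conditionally append the backend sources and link libraries to lists
# that feed the single ggml target. Names are illustrative only.
if (LLAMA_VULKAN)
    find_package(Vulkan REQUIRED)
    list(APPEND GGML_SOURCES_EXTRA ggml-vulkan.cpp ggml-vulkan.h)
    list(APPEND GGML_EXTRA_LIBS    Vulkan::Vulkan)
    add_compile_definitions(GGML_USE_VULKAN)
endif()

if (LLAMA_HIPBLAS)
    find_package(hip REQUIRED)
    list(APPEND GGML_SOURCES_EXTRA ggml-cuda.cu ggml-cuda.h)
    list(APPEND GGML_EXTRA_LIBS    hip::device)
    add_compile_definitions(GGML_USE_HIPBLAS)
endif()

# One ggml target carries all enabled backends.
add_library(ggml OBJECT ggml.c ggml.h ${GGML_SOURCES_EXTRA})
target_link_libraries(ggml PUBLIC ${GGML_EXTRA_LIBS})
```

Keeping a single `ggml` target avoids duplicating compile definitions and link dependencies across per-backend libraries, which is what the separate ggml-vulkan / ggml-rocm targets required.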