cmake : fix VULKAN and ROCm builds #5525

Merged
merged 5 commits from gg/fix-cmake into master on Feb 16, 2024

Conversation

ggerganov (Owner) commented Feb 16, 2024

  • Do not create ggml-vulkan and ggml-rocm targets - link everything into ggml (see the sketch after the TODO list)
  • Improve CMakeLists.txt indentation + minor fixes
  • Fix some Vulkan compile warnings that were not visible before this change

TODO:

  • Test ROCm builds
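
A minimal CMake sketch of the consolidated layout from the first bullet: backend sources are appended to the single ggml target instead of being built as separate ggml-vulkan / ggml-rocm library targets. The variable names (GGML_SOURCES_VULKAN, LLAMA_EXTRA_LIBS) and the file lists are illustrative assumptions, not the exact upstream CMakeLists.txt.

if (LLAMA_VULKAN)
    find_package(Vulkan REQUIRED)              # provides the imported target Vulkan::Vulkan
    add_compile_definitions(GGML_USE_VULKAN)
    list(APPEND GGML_SOURCES_VULKAN ggml-vulkan.cpp ggml-vulkan.h)
    list(APPEND LLAMA_EXTRA_LIBS Vulkan::Vulkan)
endif()

# all enabled backends are compiled into the one ggml target ...
add_library(ggml
            ggml.c
            ggml.h
            ${GGML_SOURCES_VULKAN}
            )

# ... and the backend link dependencies travel with whatever consumes ggml
add_library(llama llama.cpp llama.h)
target_link_libraries(llama PRIVATE ggml ${LLAMA_EXTRA_LIBS})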

philiptaron (Collaborator) left a comment

The CMake changes look good to me (and are mostly refactors), though I am no CMake expert. I tested this through the nix flake, and you can see that the flake builds successfully in CI below.

$ nix build github:ggerganov/llama.cpp/gg/fix-cmake#{default,rocm,vulkan,cuda,mpi-cpu,mpi-cuda,opencl}
$ echo $?
0

philiptaron (Collaborator)

@ggerganov you can see that ROCm built successfully at https://github.com/ggerganov/llama.cpp/actions/runs/7929857796/job/21650935395?pr=5525

ggerganov merged commit 5bf2b94 into master on Feb 16, 2024
62 of 63 checks passed
ggerganov deleted the gg/fix-cmake branch on February 16, 2024 at 17:06
0cc4m (Collaborator) commented Feb 16, 2024

Thanks for the fix.

jordankanter pushed a commit to jordankanter/llama.cpp that referenced this pull request Mar 13, 2024
* cmake : fix VULKAN and ROCm builds

* cmake : fix (cont)

* vulkan : fix compile warnings

ggml-ci

* cmake : fix

ggml-ci

* cmake : minor

ggml-ci

hodlen pushed a commit to hodlen/llama.cpp that referenced this pull request Apr 1, 2024 (same commit message as above)