Cloud-V CI for RISC-V builds #3160
Conversation
Makefile (Outdated)

MK_CFLAGS += -march=rv64gcv -mabi=lp64d
MK_CXXFLAGS += -march=rv64gcv -mabi=lp64d
CFLAGS += -march=rv64gcv -mabi=lp64d
CXXFLAGS += -march=rv64gcv -mabi=lp64d
This looks unnecessary
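For context, `-march=rv64gcv` selects the 64-bit base ISA plus the general-purpose (G), compressed (C) and vector (V) extensions, and `-mabi=lp64d` is the standard hard-float LP64 ABI. Below is a minimal sketch of how these flags are used with the GNU cross toolchain, independent of the Makefile; the file name is just an example from this repository:

```sh
# Illustrative only: compile one translation unit for RV64 with the vector
# extension and the lp64d ABI using the riscv64-unknown-linux-gnu toolchain.
riscv64-unknown-linux-gnu-gcc -march=rv64gcv -mabi=lp64d -O3 -c ggml.c -o ggml.o
```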
Makefile (Outdated)

CC := riscv64-unknown-linux-gnu-gcc
CXX := riscv64-unknown-linux-gnu-g++
@ggerganov I have replaced my Makefile with your upstream Makefile, which is in the master branch right now, so it's good to go.
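As a side note on why hard-coding the toolchain is avoidable: variables given on the make command line take precedence over assignments inside the Makefile, so the cross compiler can be selected per invocation. A minimal sketch, assuming the riscv64-unknown-linux-gnu toolchain is on PATH and using the repository's `main` target:

```sh
# Illustrative: pick the cross toolchain on the command line instead of
# editing the Makefile; command-line variables override Makefile assignments.
make CC=riscv64-unknown-linux-gnu-gcc \
     CXX=riscv64-unknown-linux-gnu-g++ \
     main
```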
@@ -49,7 +49,7 @@ test: $(TEST_TARGETS)
./$$test_target $(CURDIR)/models/ggml-vocab-llama.gguf; \
elif [ "$$test_target" = "tests/test-tokenizer-0-falcon" ]; then \
continue; \
elif [ "$$test_target" = "tests/test-tokenizer-1" ]; then \
@alitariq4589 Are these changes correct?
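For readers without the full file, the hunk above sits inside the shell loop of the `test` target that runs each test binary and special-cases the tokenizer tests. A rough sketch of that loop shape follows; the condition guarding the first visible line and the bodies of branches not shown in the hunk are assumptions, not the exact upstream recipe:

```sh
# Approximate shape of the loop the hunk belongs to (names taken from the
# hunk where visible; everything else is a placeholder).
for test_target in $TEST_TARGETS; do
    if [ "$test_target" = "tests/test-tokenizer-0-llama" ]; then   # assumed condition
        ./$test_target "$PWD/models/ggml-vocab-llama.gguf"
    elif [ "$test_target" = "tests/test-tokenizer-0-falcon" ]; then
        continue
    elif [ "$test_target" = "tests/test-tokenizer-1" ]; then
        continue                                                    # placeholder body
    else
        ./$test_target
    fi
done
```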
* Added Cloud-V File
* Replaced Makefile with original one

---------

Co-authored-by: moiz.hussain <moiz.hussain@10xengineers.ai>
In this PR, I have added a CI file (.devops/cloud-v-pipeline) for testing llama.cpp on the Cloud-V platform. The build runs on QEMU with the RISC-V vector extension enabled. We also have multiple RISC-V boards on https://dash.cloud-v.co as runners.
The details of this PR's CI build can be seen here: https://cloud-v.co:8443/view/Vector%20processing/job/llama.cpp/34/
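To illustrate what such a pipeline exercises, here is a hedged sketch, not the contents of .devops/cloud-v-pipeline; the QEMU CPU parameters, sysroot path and model path are assumptions:

```sh
# Cross-build llama.cpp, then run the resulting binary under user-mode QEMU
# with the RISC-V vector extension enabled. All paths are placeholders.
make CC=riscv64-unknown-linux-gnu-gcc \
     CXX=riscv64-unknown-linux-gnu-g++ \
     CFLAGS="-march=rv64gcv -mabi=lp64d" \
     CXXFLAGS="-march=rv64gcv -mabi=lp64d"

qemu-riscv64 -cpu rv64,v=true,vlen=256,elen=64,vext_spec=v1.0 \
             -L /usr/riscv64-linux-gnu \
             ./main -m ./models/some-model-q4_0.gguf -p "Hello from RISC-V"
```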
For automated builds, credentials for @ggerganov have been created. The repository owner only needs to create a webhook in the repository settings and add a GitHub access token (limited to the repo scope) in Cloud-V; the instructions for both can be found in this link. To trigger builds manually for this repository, you can also use this CI build, which is open to anyone for building the ggerganov/llama.cpp repository on a RISC-V runner in Cloud-V.