Issues: ggerganov/llama.cpp
Issues list
Label note: the "medium severity" label is used to report medium severity bugs in llama.cpp (e.g. malfunctioning features that are still usable).
fatal error: 'hip/hip_fp16.h' file not found when building using CMake and ROCm 6.2
  Labels: bug-unconfirmed, medium severity | #10236, opened Nov 9, 2024 by lubosz
Bug: Nondeterministic results on AMD RDNA3 (ROCm) despite zero temperature and fixed seed
  Labels: bug-unconfirmed, medium severity | #10197, opened Nov 6, 2024 by Googulator
Bug: Failed to convert OuteAI/OuteTTS-0.1-350M
  Labels: bug-unconfirmed, medium severity | #10178, opened Nov 5, 2024 by apepkuss
Bug: GGML_ASSERT(i01 >= 0 && i01 < ne01) failed
  Labels: bug-unconfirmed, medium severity | #10157, opened Nov 4, 2024 by ccreutzi
Bug: gguf tries to access newbyteorder, which was removed in NumPy 2.0
  Labels: bug-unconfirmed, medium severity | #10127, opened Nov 1, 2024 by renxida
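Background on the report above: NumPy 2.0 removed the ndarray.newbyteorder method, while the dtype method of the same name remains. A minimal sketch of the 2.0-compatible pattern; the call site inside gguf is assumed, not quoted from the package:

    import numpy as np

    arr = np.arange(4, dtype=np.uint32)

    # Old code such as `arr.newbyteorder('<')` raises AttributeError on NumPy 2.x.
    # The equivalent that works on both 1.x and 2.x views the buffer through a
    # byte-swapped dtype instead:
    little_endian = arr.view(arr.dtype.newbyteorder('<'))
    print(little_endian.dtype)  # uint32 with explicit little-endian byte order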
Bug: [SYCL] SYCL + Docker
  Labels: bug-unconfirmed, medium severity | #10113, opened Oct 31, 2024 by easyfab
Bug: Floating Point Exceptions turned off by default, hiding fpExceptions
  Labels: bug-unconfirmed, medium severity | #10083, opened Oct 29, 2024 by borisweberdev
Bug: llama-server not logging to file
  Labels: bug-unconfirmed, medium severity | #10078, opened Oct 29, 2024 by PyroGenesis
Bug: Server /v1/chat/completions API response's model info is wrong
  Labels: bug-unconfirmed, medium severity | #10056, opened Oct 26, 2024 by RifeWang
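The report above concerns the "model" field returned by llama-server's OpenAI-compatible endpoint. A minimal sketch for inspecting that field; the host, port, and model name below are assumptions, not taken from the report:

    import json
    import urllib.request

    # Assumes llama-server is listening on its default address, localhost:8080.
    request = urllib.request.Request(
        "http://localhost:8080/v1/chat/completions",
        data=json.dumps({
            "model": "my-model",  # hypothetical name; the issue is about what comes back
            "messages": [{"role": "user", "content": "Hello"}],
        }).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as resp:
        body = json.load(resp)

    # The issue reports that this field does not match the model actually loaded.
    print(body.get("model"))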
Bug: llama-export-lora converts all non-F32 values to F16
  Labels: bug-unconfirmed, medium severity | #10049, opened Oct 25, 2024 by NWalker1208
Bug: Certain RPC Servers cause major slowdown to Host machine
  Labels: bug-unconfirmed, medium severity | #10047, opened Oct 25, 2024 by GoudaCouda
Bug: Vulkan backend freezes during its execution
  Labels: bug-unconfirmed, medium severity | #10037, opened Oct 24, 2024 by GrainyTV
Bug: Why does llama-cli choose a GPU with lower performance?
  Labels: Apple Metal, bug-unconfirmed, medium severity | #10009, opened Oct 23, 2024 by badog-sing
Bug: Memory Leak in llama-server after exit
  Labels: bug-unconfirmed, medium severity | #9988, opened Oct 21, 2024 by edwin0cheng
Bug: Crashed when using newly released binaries on macOS
  Labels: bug-unconfirmed, medium severity | #9973, opened Oct 21, 2024 by morgen52
Bug: WARNING: The BPE pre-tokenizer was not recognized!
  Labels: bug-unconfirmed, medium severity | #9927, opened Oct 17, 2024 by smileyboy2019
Bug: Failing to build using CMake on tag b3912
  Labels: bug-unconfirmed, medium severity | #9913, opened Oct 16, 2024 by Martin-HZK
Bug: LLAMA_MAX_LAYERS must be increased to run FatLlama 1.7T
  Labels: bug-unconfirmed, medium severity | #9909, opened Oct 16, 2024 by nicoboss
Bug: CMake/LLVM fail to build with SVE support on Windows on Arm
  Labels: bug-unconfirmed, medium severity | #9878, opened Oct 14, 2024 by xengpro
Bug: llama.cpp with CUDA support outputs garbage response when prompt is above 30-40ish tokens
  Labels: bug-unconfirmed, medium severity, stale | #9838, opened Oct 11, 2024 by bmahabirbu
Server UI bug: corrupted generation
  Labels: medium severity, server/webui, server, stale | #9836, opened Oct 11, 2024 by ivanstepanovftw
Bug: Load time on RPC server with multiple machines
  Labels: bug-unconfirmed, medium severity, stale | #9820, opened Oct 10, 2024 by angelosathanasiadis
Bug: TypeError when YAML license field in README.md is a list during GGUF conversion
  Labels: bug-unconfirmed, medium severity | #9819, opened Oct 10, 2024 by gakugaku
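The conversion failure above stems from model cards whose YAML front matter allows the license field to be either a single string or a list. A hypothetical sketch of the shape mismatch and one way a caller could normalize it; this is illustrative, not the fix adopted in the repository:

    # Two shapes the front matter can take once parsed from YAML.
    front_matter_single = {"license": "apache-2.0"}
    front_matter_multi = {"license": ["apache-2.0", "llama3.1"]}

    def normalize_license(metadata: dict) -> str:
        """Return one license string whether the field is a str or a list."""
        value = metadata.get("license", "")
        if isinstance(value, list):
            return ",".join(value)
        return value

    print(normalize_license(front_matter_single))  # apache-2.0
    print(normalize_license(front_matter_multi))   # apache-2.0,llama3.1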
Bug: Severe performance degradation when using llama.cpp to deploy a pruned Llama 3.1 model
  Labels: bug-unconfirmed, medium severity | #9818, opened Oct 10, 2024 by gudehhh666
Bug: [vulkan] llama.cpp does not work on Raspberry Pi 5
  Labels: bug-unconfirmed, medium severity, stale | #9801, opened Oct 9, 2024 by FanShupei