Issues: ggerganov/llama.cpp

Bug: image encoding error with malloc memory
Labels: bug-unconfirmed, low severity (used to report low severity bugs in llama.cpp, e.g. cosmetic issues, non-critical UI glitches)
#10225, opened Nov 9, 2024 by dingtine

bge-multilingual-gemma2: ERROR:hf-to-gguf:Model Gemma2Model is not supported
Labels: bug-unconfirmed, low severity
#10215, opened Nov 8, 2024 by hellozjj

Bug: Speculative Decoding "Segmentation fault (core dumped)"
Labels: bug (something isn't working), low severity
#10176, opened Nov 4, 2024 by AbdullahMPrograms

Bug: CANN E89999
Labels: Ascend NPU (issues specific to Ascend NPUs), bug-unconfirmed, low severity
#10161, opened Nov 4, 2024 by ninth99

Bug: --log-disable also disables output from the model
Labels: bug-unconfirmed, low severity
#10155, opened Nov 4, 2024 by mervn

Bug: llama-quantize --help is not printed
Labels: bug-unconfirmed, low severity
#10122, opened Nov 1, 2024 by ivanstepanovftw

Bug: SwiftUI example does not work on simulator
Labels: bug, low severity
#10089, opened Oct 29, 2024 by guinmoon

Bug: Add option for explicit Metal device selection on macOS
Labels: bug-unconfirmed, low severity
#10003, opened Oct 22, 2024 by badog-sing

Bug: Version information missing when using llama-cli built on mac
Labels: bug-unconfirmed, low severity
#9977, opened Oct 21, 2024 by morgen52

Bug: Unexpected output length (only one token response!) when setting "-n -2 -c 256" for llama-server
Labels: bug, good first issue (good for newcomers), low severity
#9933, opened Oct 18, 2024 by morgen52

Bug: gemma-2-9b-it inference speed very slow, 1.73 tokens per second
Labels: bug-unconfirmed, low severity
#9906, opened Oct 16, 2024 by ninth99

Bug: imatrix crash - nan detected in blk.1.attn_output.weight
Labels: bug-unconfirmed, low severity
#9899, opened Oct 15, 2024 by robbiemu

Bug: Inconsistency while parsing the model using llama-cli and gguf-py
Labels: bug-unconfirmed, low severity
#9893, opened Oct 15, 2024 by Lyutoon

llama.cpp is slow on GPU
Labels: bug-unconfirmed, low severity, Nvidia GPU (issues specific to Nvidia GPUs)
#9881, opened Oct 14, 2024 by vineel96

Bug: Unable to build the project with HIP; fatal error: 'hipblas/hipblas.h' file not found
Labels: bug-unconfirmed, low severity, stale
#9815, opened Oct 10, 2024 by RandUser123sa

Typo on build.md?
Labels: bug-unconfirmed, low severity, stale
#9793, opened Oct 8, 2024 by lisatwyw

Bug: No improvement for NEON?
Labels: bug-unconfirmed, low severity, stale
#9774, opened Oct 7, 2024 by Abhranta

Bug: IQ3_M is significantly slower than IQ4_XS on AMD, is it expected?
Labels: bug-unconfirmed, low severity
#9644, opened Sep 25, 2024 by Nekotekina

Bug: llama-server web UI resets the text selection during inference on every token update
Labels: bug, good first issue, help wanted (extra attention is needed), low severity
#9608, opened Sep 23, 2024 by mashdragon

Bug: Random inputs generated automatically in llama-cli
Labels: bug-unconfirmed, low severity, stale
#9456, opened Sep 12, 2024 by Abhranta

Bug: llama_print_timings seems to accumulate load_time/total_time in llama-bench
Labels: bug, low severity
#9286, opened Sep 3, 2024 by akx

Bug: (Server) Cannot properly cancel a non-stream completion request
Labels: bug, low severity, server
#9273, opened Sep 2, 2024 by ngxson

Bug: Grammar readme seems incorrect
Labels: bug, documentation (improvements or additions to documentation), low severity
#7720, opened Jun 3, 2024 by thekevinscott