Super simple Python connectors for llama.cpp, including vision models (Gemma 3, Qwen2-VL) #1977
fidecastro started this conversation in Show and tell
I built a lightweight alternative to llama-cpp-python that stays current with llama.cpp's latest releases and enables Python integration with its vision models.
It works on top of a local llama.cpp build, which the user either compiles manually or via a Dockerfile script I added to the repo. This makes it easy to stay up to date with the latest llama.cpp releases. It's nowhere near as ambitious as llama-cpp-python, but it can be useful for those of us who want to tinker with the newest releases.
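For readers curious what "Python on top of a local llama.cpp build" can look like in practice, here is a minimal sketch of the general wrapper pattern: build a command line for a compiled llama.cpp binary and run it via subprocess. The binary path and the helper names (`build_llama_cmd`, `run_llama`) are illustrative assumptions, not the connector's actual API; the `-m`, `-p`, and `-n` flags match upstream llama-cli.

```python
import subprocess

def build_llama_cmd(model_path, prompt, binary="./llama-cli", n_predict=128):
    # Assemble an invocation of a locally compiled llama.cpp CLI binary.
    # Flags: -m = model file, -p = prompt, -n = max tokens to generate.
    return [binary, "-m", model_path, "-p", prompt, "-n", str(n_predict)]

def run_llama(model_path, prompt, **kwargs):
    # Run the binary (assumed to exist at `binary`) and return its stdout.
    result = subprocess.run(
        build_llama_cmd(model_path, prompt, **kwargs),
        capture_output=True, text=True, check=True,
    )
    return result.stdout
```

Because the wrapper only shells out to whatever binary you compiled, rebuilding llama.cpp from the latest release updates the whole stack with no Python-side changes.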
I hope this is as useful to you as it has been for me. Feedback and criticism are warmly welcomed.
https://github.com/fidecastro/llama-cpp-connector