ChatLlamaCpp doesn't seem to return logprobs. #29872
Unanswered
chaoyupeng asked this question in Q&A
Example Code
Description
Hi LangChain Community,
I've been trying to use the LlamaCpp and ChatLlamaCpp classes to get the log probabilities of the generated tokens, which I need for my follow-up work.
However, I ran into the following issue:
Any ideas on how I could solve this?
The goal is to access the logprobs of the predicted tokens while keeping tool-calling functionality. Do I need to switch frameworks or model engines to achieve this?
Thanks
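For reference, the lower-level llama-cpp-python completion API can return an OpenAI-style logprobs block (via `Llama.create_completion(..., logprobs=N)`). Below is a minimal sketch of how I'd hope to consume that structure through LangChain; the payload dict and its numbers are fabricated for illustration, since running an actual model isn't shown here:

```python
# Illustrative, hand-written payload mimicking the OpenAI-style "logprobs"
# field that llama-cpp-python's create_completion can produce when
# logprobs=N is passed. All values below are made up for this example.
completion = {
    "choices": [
        {
            "text": " Paris",
            "logprobs": {
                "tokens": [" Paris"],
                "token_logprobs": [-0.12],
                "top_logprobs": [{" Paris": -0.12, " London": -2.5}],
            },
        }
    ]
}

def token_logprobs(completion: dict) -> list[tuple[str, float]]:
    """Pair each generated token with its log probability."""
    lp = completion["choices"][0]["logprobs"]
    return list(zip(lp["tokens"], lp["token_logprobs"]))

print(token_logprobs(completion))  # [(' Paris', -0.12)]
```

This is the shape of data I'd like ChatLlamaCpp to surface (e.g. in `response_metadata`), rather than a working workaround.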
System Info
Ubuntu 24.04
Latest LangChain and LangGraph.