Description
Hi, there!
It seems that after a while I start to get the same response over and over again, even if I use a different prompt. This happened in 3 different chats while I was testing with the same model.
Settings:
- Linux (Arch Linux), zsh
- elia installed via pipx
- Using ollama
- Model: phi3
- The modelfile sets "num_keep=4", which, as I understand it, means it should keep the last 4 messages in context.
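For reference, a minimal sketch of what the relevant part of the modelfile looks like (the actual file may contain more directives; the base model name here is assumed from the setup above):

```
# Sketch of the modelfile in question (assumed layout)
FROM phi3

# The parameter mentioned above
PARAMETER num_keep 4
```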
I will try with other models later. Please let me know if you need more info.
Edit: I have just run some tests with another model whose modelfile does not set "num_keep" (stablelm2:zephyr), and it shows the same problem: after a while it keeps repeating information from previous inferences.
Thanks again. Regards