Description
Describe your problem
RAGFlow is configured with a local LLM, and during a conversation it ultimately calls the OpenAI() interface. I traced the entire flow and found that the num_ctx parameter is never set, which leaves Ollama unable to handle long contexts:
```python
class LocalAIChat(OllamaChat):
    def __init__(self, key, model_name, base_url):
        if not base_url:
            raise ValueError("Local llm url cannot be None")
        if base_url.split("/")[-1] != "v1":
            base_url = os.path.join(base_url, "v1")
        self.client = OpenAI(api_key="empty", base_url=base_url)
        self.model_name = model_name.split("___")[0]
```
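One possible workaround (a sketch, not RAGFlow's actual code): the OpenAI Python client accepts an `extra_body` argument whose fields are merged into the JSON request body, so `num_ctx` could be forwarded alongside the standard chat parameters. The `{"options": {"num_ctx": ...}}` payload shape is an assumption here; whether Ollama's OpenAI-compatible endpoint honors it depends on the Ollama version.

```python
def build_chat_kwargs(model_name, messages, num_ctx=8192, **gen_conf):
    """Assemble kwargs for client.chat.completions.create(), forwarding
    num_ctx via extra_body. The "options" payload shape is an assumption;
    it may be ignored by some Ollama versions on the /v1 endpoint."""
    kwargs = dict(model=model_name, messages=messages, **gen_conf)
    # extra_body fields are merged into the outgoing JSON request body
    # by the OpenAI Python client, so the server would receive
    # {"options": {"num_ctx": ...}} in addition to the standard fields.
    kwargs["extra_body"] = {"options": {"num_ctx": num_ctx}}
    return kwargs

# Usage (client is the OpenAI instance from LocalAIChat.__init__):
#   client.chat.completions.create(
#       **build_chat_kwargs(self.model_name, messages, num_ctx=8192)
#   )
```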