[Question]: LocalAIChat How to set num_ctx in ollama #5743

Open
@dawn9551

Description

Describe your problem

RAGFlow is configured with a local AI model, and during a conversation it ultimately calls the OpenAI() client. I traced the entire call path and found that the num_ctx parameter is never set, so Ollama cannot handle long contexts.

```python
import os

from openai import OpenAI


class LocalAIChat(OllamaChat):
    def __init__(self, key, model_name, base_url):
        if not base_url:
            raise ValueError("Local llm url cannot be None")
        if base_url.split("/")[-1] != "v1":
            base_url = os.path.join(base_url, "v1")
        self.client = OpenAI(api_key="empty", base_url=base_url)
        self.model_name = model_name.split("___")[0]
```
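A possible workaround, as a minimal sketch: Ollama's OpenAI-compatible endpoint accepts runtime options (including `num_ctx`) through a non-standard `options` field in the request body, and the OpenAI Python SDK lets you smuggle extra fields in via `extra_body`. The helper name `build_chat_kwargs` and the default of 8192 below are my own illustrative choices, not part of RAGFlow:

```python
def build_chat_kwargs(model_name, messages, num_ctx=8192):
    """Build kwargs for client.chat.completions.create() that ask
    Ollama for a larger context window.

    Assumption: Ollama's OpenAI-compatible endpoint reads an
    "options" object from the request body, so we pass it through
    the OpenAI SDK's extra_body escape hatch.
    """
    return {
        "model": model_name,
        "messages": messages,
        # extra_body fields are merged into the JSON request body
        # by the OpenAI SDK; "num_ctx" is Ollama's context-length knob.
        "extra_body": {"options": {"num_ctx": num_ctx}},
    }


kwargs = build_chat_kwargs("llama3", [{"role": "user", "content": "hi"}])
# The call site would then be: self.client.chat.completions.create(**kwargs)
```

If this works against your Ollama version, the cleanest fix in RAGFlow would be to merge such an `extra_body` into the `create()` call inside `LocalAIChat` rather than at every call site.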
