
[Bug]: ERROR: Missing required arguments; Expected either ('max_tokens', 'messages' and 'model')  #6421

Open
@opendeluxe

Description

Self Checks

  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report (Language Policy).
  • Non-English title submissions will be closed directly (Language Policy).
  • Please do not modify this template :) and fill in all the required fields.

RAGFlow workspace code commit ID

efc4796

RAGFlow image version

v0.17.2-slim

Other environment information

Ubuntu 24

Actual behavior

Starting a chat gives the following error:

ERROR: Missing required arguments; Expected either ('max_tokens', 'messages' and 'model') or ('max_tokens', 'messages', 'model' and 'stream') arguments to be given

I have configured both OpenAI and Anthropic with my API keys and assigned the System Model Settings accordingly.

I could not find anywhere to set values like "max_tokens", "messages", and "model".
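
For context, this exact error string looks like the client-side argument check in the Anthropic Python SDK, whose `messages.create()` requires `max_tokens` in addition to `model` and `messages`. Below is a minimal sketch of what presumably triggers it, outside of RAGFlow; the API key and model name are placeholders, and this is an assumption about the root cause, not something confirmed from RAGFlow's code:

```python
# Minimal sketch (assumption): reproducing the error directly with the
# Anthropic Python SDK. The argument check happens client-side, before any
# request is sent, so a dummy API key is enough to trigger it.
import anthropic

client = anthropic.Anthropic(api_key="dummy-key")  # placeholder key

# Omitting max_tokens raises:
#   TypeError: Missing required arguments; Expected either
#   ('max_tokens', 'messages' and 'model') or
#   ('max_tokens', 'messages', 'model' and 'stream') arguments to be given
client.messages.create(
    model="claude-3-5-sonnet-20241022",            # placeholder model name
    messages=[{"role": "user", "content": "hi"}],
    # max_tokens=1024,                             # supplying this avoids the error
)
```

If RAGFlow's Anthropic integration omits `max_tokens` for some chat-assistant configurations, that could explain why configuring the API keys alone does not help.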

(Screenshots attached.)

Expected behavior

It would either perform the chat request or let me supply the missing values.

Steps to reproduce

1. Run RAGFlow with Docker
2. Sign up
3. Configure OpenAI and Anthropic as LLM providers
4. Start a chat assistant
5. Write "hi"

Additional information

No response
