
[Bug]: OpenAI endpoint sends/returns too many characters/tokens #5545

Closed
@Snify89

Description


Is there an existing issue for the same bug?

  • I have checked the existing issues.

RAGFlow workspace code commit ID

latest

RAGFlow image version

latest

Other environment information

Actual behavior

The OpenAI endpoint works great with the chat ID. I have implemented it in OpenWebUI and noticed that the response actually contains too many tokens/characters.
E.g. when the LLM answers "This is a great idea", it answers something like "This is a great idea. dea." The last part of the answer is duplicated (depending on the length of the entire answer, I guess). I suspect there is a miscalculation of some sort. I haven't tested it with curl or anything similar, so it might be an OpenWebUI issue, but I doubt it.
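For reference, here is a minimal sketch of how this could be checked without OpenWebUI, assuming the `openai` Python client and RAGFlow's OpenAI-compatible base URL layout; the host, chat ID, API key, and model name below are placeholders/assumptions, not tested:

```python
# Minimal reproduction sketch (untested): base_url layout, chat ID, API key,
# and model name are assumed placeholders for a RAGFlow deployment.
from openai import OpenAI

client = OpenAI(
    api_key="<RAGFLOW_API_KEY>",  # placeholder
    base_url="http://<ragflow-host>/api/v1/chats_openai/<chat_id>",  # assumed endpoint layout
)

stream = client.chat.completions.create(
    model="model",  # assumed placeholder model name
    messages=[{"role": "user", "content": "Answer with one short sentence."}],
    stream=True,
)

# Reassemble the answer from the streamed delta chunks.
answer = ""
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        answer += chunk.choices[0].delta.content

print(repr(answer))

# Crude check for the reported symptom: the tail of the answer appearing twice,
# e.g. "This is a great idea. dea."
tail = answer.rstrip()[-4:]
if tail and answer.count(tail) > 1:
    print(f"Tail {tail!r} occurs more than once; looks like the duplication.")
```

If the duplicated tail also shows up here, that would rule out OpenWebUI as the cause.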

Expected behavior

No response

Steps to reproduce

Add the OpenAI endpoint to OpenWebUI (or possibly any other client) and chat.

Additional information

No response

Metadata

Labels

🐞 bug: Something isn't working