Closed
Description
Is there an existing issue for the same bug?
- I have checked the existing issues.
RAGFlow workspace code commit ID
tag: 0.17.0 , commit ID: d683644
RAGFlow image version
v0.17.0
Other environment information
I am running RAGFlow in Docker containers on an AWS virtual machine that uses this AMI: Deep Learning OSS Nvidia Driver AMI GPU PyTorch 2.2.0 (Amazon Linux 2) 20240517. The RAGFlow instance is connected over the Docker network to Ollama.
Actual behavior
The following code:
session = assistant.create_session()
answer = session.ask(question, stream=False)
print(type(answer))
Results in:
<generator object Session.ask at 0x000001DBDDE8E510>
Indeed, iterating over the returned object like this:
list_of_ans = []
for answer in session.ask(question, stream=False):
    list_of_ans.append(answer)
produces a list of Message objects, each containing incrementally more of the response, with the last one holding the chat model's complete answer. The attached screenshot shows that list:
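As a stopgap until the return type matches the docs, the generator can simply be drained and only the last yielded item kept, since each yield carries incrementally more content. This is a minimal sketch of that workaround; `fake_ask` below is a hypothetical stand-in for `Session.ask` (it is not part of the RAGFlow SDK) that mimics the observed generator behaviour:

```python
def fake_ask(question, stream=False):
    # Hypothetical stand-in for Session.ask: yields incrementally longer
    # partial answers, mimicking the observed behaviour even with stream=False.
    full = "RAGFlow is an open-source RAG engine."
    words = full.split()
    for i in range(1, len(words) + 1):
        yield " ".join(words[:i])

def ask_non_streaming(question):
    # Drain the generator; the final yielded item holds the complete response.
    last = None
    for partial in fake_ask(question, stream=False):
        last = partial
    return last

print(ask_non_streaming("What is RAGFlow?"))
# → RAGFlow is an open-source RAG engine.
```

With the real SDK the same loop would collect Message objects, so keeping only the last one recovers the full answer that stream=False is documented to return directly.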
Expected behavior
As the Python API docs define it, session.ask(question, stream=False) should return a single Message object containing the complete response (see https://ragflow.io/docs/dev/python_api_reference#returns-25).
Steps to reproduce
Define a session with any chat assistant backed by a model and run the code above.
Additional information
No response