
[Bug]: "You don't own the dataset" error when calling list_datasets(id=XXXXX) from another RAGFlow object with the same API key #5398

Closed
@Unaiideko

Description

Is there an existing issue for the same bug?

  • I have checked the existing issues.

RAGFlow workspace code commit ID

...

RAGFlow image version

v0.16.0

Other environment information

I am running RAGFlow in Docker containers on an AWS virtual machine that uses the AMI "Deep Learning OSS Nvidia Driver AMI GPU PyTorch 2.2.0 (Amazon Linux 2) 20240517". The RAGFlow deployment is connected over the network to Ollama.

Actual behavior

I have a problem retrieving a dataset whose ID is stored in a JSON file. I create one RAGFlow object, use it to create the dataset, and store the dataset ID. I then create a second RAGFlow object with the same API key and base URL and use it to list the dataset by the stored ID. Since both objects are initialized with the same credentials, this should return the dataset; instead, this error is raised:
Exception: You don't own the dataset XXXXXXXXXXXXXXXXXX

Expected behavior

It should allow retrieving the datasets by ID as usual, even when the retrieval is done through another RAGFlow object, since both objects are initialized with the same information.

Steps to reproduce

Create a RAGFlow object, create a knowledge base (KB), and store the KB's ID. Then create a second RAGFlow object with the same API key and try to list the dataset using the stored ID, as in the sketch below.
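
A minimal reproduction sketch, assuming the ragflow-sdk Python client; the API key, base URL, dataset name, and ids.json path are placeholders:

```python
import json

from ragflow_sdk import RAGFlow

API_KEY = "ragflow-xxxxxxxxxxxxxxxx"   # placeholder
BASE_URL = "http://localhost:9380"     # placeholder

# First RAGFlow object: create the knowledge base and persist its ID to JSON.
rag_a = RAGFlow(api_key=API_KEY, base_url=BASE_URL)
dataset = rag_a.create_dataset(name="test_kb")
with open("ids.json", "w") as f:
    json.dump({"dataset_id": dataset.id}, f)

# Second RAGFlow object: same API key and base URL,
# so it should see the same datasets.
rag_b = RAGFlow(api_key=API_KEY, base_url=BASE_URL)
with open("ids.json") as f:
    dataset_id = json.load(f)["dataset_id"]

# Expected: returns the dataset created above.
# Actual (v0.16.0): raises "Exception: You don't own the dataset <dataset_id>".
datasets = rag_b.list_datasets(id=dataset_id)
print(datasets)
```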

Additional information

No response
