
Bulk Delete by array of UUID strings #525

@levi-katarok

Description


Describe the bug
Unable to delete using a .filter('vector_id', 'in', array_of_ids)

To Reproduce
Steps to reproduce the behavior:

  1. Create a Supabase Python client (mine is stored under:)
    self.commons['supabase']
  2. Have an array of UUIDs represented as strings, e.g.:
    vector_ids = ["aed10938-65db-4678-8438-cf7684eaefd7", "94a58004-7f19-4e79-b189-0824e55d4c8a"]
  3. First I tried
    self.commons['supabase'].table('brains_vectors').delete().filter('vector_id', "in", vector_ids).execute()

The error:

raise APIError(r.json())
postgrest.exceptions.APIError: {'code': 'PGRST100', 'details': 'unexpected "[" expecting "("', 'hint': None, 'message': '"failed to parse filter
  4. Then I tried
    self.commons['supabase'].table('brains_vectors').delete().filter('vector_id', "in", tuple(vector_ids)).filter('brain_id', 'eq', self.id).execute()

The error:
postgrest.exceptions.APIError: {'code': '22P02', 'details': None, 'hint': None, 'message': 'invalid input syntax for type uuid: "\'6eafc10e-6e1d-44c6-9ba9-99f2044b30d0\'"'}

I also wasn't able to get those strings serialized with double quotes; Python kept rendering them with single quotes.
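
For context: PostgREST's in operator expects a parenthesized, comma-separated list (vector_id=in.(id1,id2)), so a Python list serializes to the [ that triggers the first error, and a tuple serializes with the single quotes that trigger the second. A minimal sketch of a workaround, assuming the same client setup as above, would be to build that string by hand:

    # Hypothetical workaround: format the value the way PostgREST expects,
    # i.e. "(uuid1,uuid2)" -- no brackets, no quotes around each id.
    in_value = f"({','.join(vector_ids)})"
    self.commons['supabase'].table('brains_vectors').delete().filter('vector_id', 'in', in_value).execute()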

Expected behavior
I was hoping to bulk delete a batch of vector ids for an LLM application. Right now, deleting in a for loop takes ~0.5 seconds per vector, so a 1MB file with 237 vectors takes around 2 minutes, which is a long time.


Desktop (please complete the following information):

  • OS: macOS Ventura
  • IDE: Visual Studio Code

Additional context
Supabase version: supabase==1.0.3

If anyone knows how to do this bulk delete, any help is greatly appreciated. I'm happy to help make the changes or add to the documentation as well!
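
One other avenue that might work (untested on my setup): postgrest-py, which supabase-py delegates filtering to, exposes an in_() helper that should handle the serialization itself, something like:

    # Sketch using postgrest-py's in_() filter helper; assumes the
    # postgrest-py version bundled with supabase==1.0.3 includes it.
    self.commons['supabase'].table('brains_vectors').delete().in_('vector_id', vector_ids).execute()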

Thanks!

Labels: enhancement (New feature or request)