
Stream emits old copies of data #462

Open
@Wizzel1

Description


Describe the bug

Initial problem - solved

I have a realtime table that contains a column called json_data of type jsonb, with nullable set to false.
When I update another column in the same row, the stream emits data where json_data is null.

Gary Austin suggested on Discord that this may be caused by the Postgres Changes payload limit, because json_data is quite large, so I updated my model and made json_data nullable.

New problem - unsolved

Besides the json_data column there is also a column status, which can be pending, processing, or completed.

I have set up my stream like this:

  return ref
      .read(supabaseDatabaseProvider)
      .from('upload_tasks')
      .stream(primaryKey: ['id'])
      .neq('status', 'completed');

because I only want the items where status is not completed.

My code is written so that only one row can have status processing at a time, but the stream emits multiple rows where status is processing.

I have watched the actual status in the database and everything worked as expected there.

So my question is: why do I get "outdated" rows with status processing when the database only ever contains one processing row at a given time? Is this also related to the payload limit?
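As a stopgap, I could re-filter each emitted snapshot on the client so stale copies cannot surface. A rough sketch (the updated_at column and the list-of-maps row shape are assumptions about my table, not something the stream guarantees):

  // Rough sketch of a client-side re-filter: even if Realtime emits stale
  // copies, drop any row whose status is 'completed' and keep at most one
  // row with status 'processing' (the most recently updated one).
  List<Map<String, dynamic>> filterSnapshot(List<Map<String, dynamic>> rows) {
    final pending = <Map<String, dynamic>>[];
    Map<String, dynamic>? latestProcessing;

    for (final row in rows) {
      final status = row['status'] as String?;
      if (status == 'completed') continue; // enforce the .neq filter locally

      if (status == 'processing') {
        // Keep only the newest 'processing' row; assumes an updated_at
        // timestamp column exists on upload_tasks.
        if (latestProcessing == null ||
            DateTime.parse(row['updated_at'] as String).isAfter(
                DateTime.parse(latestProcessing['updated_at'] as String))) {
          latestProcessing = row;
        }
      } else {
        pending.add(row);
      }
    }
    return [if (latestProcessing != null) latestProcessing, ...pending];
  }

This could then be applied with .map(filterSnapshot) on the stream above. It only hides the symptom on the client, though; it does not explain why the stream emits the stale rows in the first place.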

Metadata


    Labels

    blocked (This issue is blocked by another issue), bug (Something isn't working), realtime (This issue or pull request is related to realtime)
