
feature: concurrent Redis consuming #1507

Open
@ryanrain2016

Description

Describe the bug
It seems that tasks don't run in parallel: the Redis list subscriber processes messages one at a time.

How to reproduce
Include source code:

import asyncio

from faststream import FastStream
from faststream.redis import RedisBroker
from pydantic import BaseModel

redis_dsn = 'xxxx'
rb = RedisBroker(redis_dsn)

class User(BaseModel):
    name: str
    age: int = 0

@rb.subscriber(list="users")
async def my_listener(user: User):
    # Each message takes 3 seconds to handle.
    await asyncio.sleep(3)
    print(user, 'from faststream')

async def producer():
    # Publish 10 messages to the "users" list in quick succession.
    for i in range(10):
        await rb.publish(User(name="Bob", age=i), list="users")

async def main():
    await rb.connect()
    asyncio.create_task(producer())
    app = FastStream(rb)
    await app.run()

if __name__ == '__main__':
    asyncio.run(main())

And/Or steps to reproduce the behavior:

Run the script above.

Expected behavior
Tasks should run in parallel, so several messages can be handled at the same time.

Observed behavior
Tasks run one after another; the next message is not consumed until the previous handler has returned.
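
Until concurrent Redis consuming is supported by the broker itself, one possible workaround is a minimal sketch like the one below: the handler returns immediately and hands the slow work to a background task, with a semaphore bounding how many messages are processed at once. The names `process`, `background_tasks`, `sem`, and the limit of 10 are my own assumptions for illustration, not FastStream API.

import asyncio

from faststream.redis import RedisBroker
from pydantic import BaseModel

rb = RedisBroker('xxxx')

class User(BaseModel):
    name: str
    age: int = 0

# Hypothetical workaround: cap concurrent work at 10 and keep strong
# references to spawned tasks so they are not garbage-collected early.
sem = asyncio.Semaphore(10)
background_tasks: set[asyncio.Task] = set()

async def process(user: User) -> None:
    async with sem:
        await asyncio.sleep(3)  # the real work
        print(user, 'processed concurrently')

@rb.subscriber(list="users")
async def my_listener(user: User):
    # Return right away so the subscriber can fetch the next message;
    # the slow work continues in a separate task.
    task = asyncio.create_task(process(user))
    background_tasks.add(task)
    task.add_done_callback(background_tasks.discard)

The trade-off is that any exception raised inside the background task happens after the handler has already returned, so FastStream's normal error handling will not see it; errors have to be caught inside process() itself.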


Metadata


    Labels

    Redis: Issues related to the `faststream.redis` module and Redis features
    enhancement: New feature or request
    good first issue: Good for newcomers


    Projects

    Status: Waiting for merge
