[Demo request] Is it possible to run with multiprocessing? #4

Open
@xareelee

Description

Thanks for this awesome lib. It makes throttling easy.

I have many heavy jobs that not only need to be throttled when making HTTP requests, but also need to do heavy, CPU-bound computation on the results.

The README shows the concurrent (single-process, asyncio) way to use throttler. Is it possible to run jobs on multiple processes with throttler?
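To make the question concrete, this is roughly the single-process concurrent pattern I mean (an untested sketch; task is the throttled coroutine from the snippet below, and many_tasks_concurrent is just an illustrative name):

async def many_tasks_concurrent(count: int):
    # One process, one event loop: the throttle limits the call rate as
    # expected, but CPU-bound work inside each task still blocks the loop.
    return await asyncio.gather(*(task(i) for i in range(count)))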

Here is what I tried, using joblib:

import asyncio
import time

from joblib import Parallel, delayed
from throttler import throttle

# Limit to two calls per second
@throttle(rate_limit=1, period=0.5)
async def task(i):
    print(f"{i}: {time.time()} start")
    await asyncio.sleep(5)
    print(f"{i}: {time.time()} end")
    return i


async def many_tasks(count: int):
    print("=== START ===")
    results = Parallel(n_jobs=-1)(delayed(task)(i) for i in range(count))
    print(results)
    print("=== END ===")

asyncio.run(many_tasks(14))

The above code raises an error:

...
TypeError: cannot pickle 'coroutine' object
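As far as I can tell, calling the async task only produces a coroutine object (joblib never awaits it), and joblib then tries to pickle that coroutine to move it between processes, which is what fails.

One workaround I can imagine (untested sketch; run_task is just an illustrative helper name, and I'm assuming joblib's default process-based backend) is to run the coroutine to completion inside each worker with a small synchronous wrapper:

def run_task(i):
    # Await the throttled coroutine inside the worker process, so only a
    # plain (picklable) function call and its plain result cross the
    # process boundary.
    return asyncio.run(task(i))

# ... and then inside many_tasks:
results = Parallel(n_jobs=-1)(delayed(run_task)(i) for i in range(count))

But each worker process would then hold its own copy of the throttle state, so the rate limit would apply per process rather than globally, which is what I'd like to avoid.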
