Description
Thanks for this awesome lib. It makes throttling easy.
I have many heavy jobs that not only need to be throttled (HTTP requests) but also involve heavy CPU-bound computation of the results.
The README shows the concurrent approach with throttler. Is it possible to run jobs on multiple processes with throttler?
For example, using joblib:
import asyncio
import time

from joblib import Parallel, delayed
from throttler import throttle

# Limit to two calls per second (rate_limit=1 per 0.5 s period)
@throttle(rate_limit=1, period=0.5)
async def task(i):
    print(f"{i}: {time.time()} start")
    await asyncio.sleep(5)
    print(f"{i}: {time.time()} end")
    return i

async def many_tasks(count: int):
    print("=== START ===")
    results = Parallel(n_jobs=-1)(delayed(task)(i) for i in range(count))
    print(results)
    print("=== END ===")

asyncio.run(many_tasks(14))
The code above raises the following error:
...
TypeError: cannot pickle 'coroutine' object