Thread- and process-based parallelism for Python, with a simple, explicit API for running tasks in parallel.

- Threads for I/O-bound tasks (HTTP requests, file operations)
- Processes for CPU-bound tasks (computations, data processing)

No magic, just straightforward parallel execution.
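For context, a thread-based helper in the spirit of `parallel()` can be sketched with nothing but the standard library. This is a hypothetical illustration of the pattern, not pyasync's actual implementation:

```python
from concurrent.futures import ThreadPoolExecutor

def run_parallel(*fns):
    """Run zero-argument callables in separate threads.

    Hypothetical sketch: submits every callable to a thread pool
    and collects the results in call order.
    """
    with ThreadPoolExecutor(max_workers=max(len(fns), 1)) as pool:
        futures = [pool.submit(fn) for fn in fns]
        return [f.result() for f in futures]

print(run_parallel(lambda: 1 + 1, lambda: "ok"))  # [2, 'ok']
```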
## Installation

```
pip install python-async
```

## Quick Start

Threads for I/O-bound work:

```python
import pyasync
import requests

def fetch(url):
    return requests.get(url).json()

# Run 3 requests in parallel - takes ~1 second, not ~3 seconds!
results = pyasync.parallel(
    lambda: fetch("https://api.example.com/users/1"),
    lambda: fetch("https://api.example.com/users/2"),
    lambda: fetch("https://api.example.com/users/3")
)
```

Processes for CPU-bound work:

```python
import pyasync
from functools import partial

def heavy_compute(n):
    return sum(i * i for i in range(n))

# Run computations in parallel processes (bypasses the GIL)
results = pyasync.cpu_parallel(
    partial(heavy_compute, 10_000_000),
    partial(heavy_compute, 20_000_000),
    partial(heavy_compute, 30_000_000),
    timeout=10.0  # Optional timeout
)
```

## Thread-based API

### `parallel()`

Run multiple functions in parallel threads. Returns results in order.
```python
results = pyasync.parallel(
    lambda: requests.get("https://api1.com"),
    lambda: requests.get("https://api2.com"),
    lambda: requests.get("https://api3.com")
)
# All 3 run simultaneously!
```

### `background()`

Start a function in the background. Returns a `Task`.
```python
task = pyasync.background(lambda: slow_operation())

# Do other work while it runs...
print("Working...")

# Get result when ready
result = task.result()
```

### `run()`

Run a single function in the thread pool.

```python
result = pyasync.run(lambda: requests.get("https://api.com"))
```

## Process-based API

**Note:** Functions passed to the process-based API must be picklable. Use `functools.partial` instead of lambdas.
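Why `functools.partial`? Plain `pickle` can serialize a `partial` over a module-level function but not a lambda, and serializing the callable is exactly what a process pool must do to ship work to a worker. A quick stdlib check (illustrative only):

```python
import pickle
from functools import partial

def compute(n):
    return sum(range(n))

# A partial over a module-level function round-trips through pickle:
restored = pickle.loads(pickle.dumps(partial(compute, 10)))
print(restored())  # 45

# A lambda does not - pickle serializes functions by qualified name,
# and a lambda has no importable name:
try:
    pickle.dumps(lambda: compute(10))
    lambda_pickles = True
except Exception as exc:
    lambda_pickles = False
    print(f"lambda failed to pickle: {type(exc).__name__}")
```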
### `cpu_parallel()`

Run multiple functions in parallel processes with true parallelism.

```python
from functools import partial

def compute(n):
    return sum(i * i for i in range(n))

results = pyasync.cpu_parallel(
    partial(compute, 1_000_000),
    partial(compute, 2_000_000),
    partial(compute, 3_000_000),
    timeout=10.0,   # TimeoutError if exceeded
    max_workers=4   # Limit processes
)
```

### `cpu_background()`

Start a function in a background process. Returns a `CpuTask` with fine-grained control.
```python
task = pyasync.cpu_background(partial(heavy_compute, 100_000_000))

# Monitor status
print(f"Running: {task.running}")
print(f"Done: {task.done}")

# Wait with timeout
try:
    result = task.result(timeout=30.0)
except TimeoutError:
    print("Task took too long!")
    task.cancel()
```

### `cpu_run()`

Run a single function in a separate process and wait for the result.
```python
try:
    result = pyasync.cpu_run(partial(compute, 100_000_000), timeout=10.0)
except TimeoutError:
    print("Computation took too long!")
```

### `CpuTask`

`CpuTask` provides fine-grained control over CPU-bound tasks:
| Property/Method | Description |
|---|---|
| `.done` | `True` if the task completed |
| `.running` | `True` if the task is currently running |
| `.cancelled` | `True` if the task was cancelled |
| `.result(timeout=None)` | Wait for and return the result (raises `TimeoutError`) |
| `.exception(timeout=None)` | Get the exception if the task failed |
| `.cancel()` | Attempt to cancel (only works if the task has not started) |
| `.add_done_callback(fn)` | Add a completion callback |
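The `.cancel()` rule matches `concurrent.futures`: only work that has not yet been handed to a worker can be cancelled. A stdlib sketch of the same semantics (hypothetical, using threads for brevity):

```python
import time
from concurrent.futures import ThreadPoolExecutor

with ThreadPoolExecutor(max_workers=1) as pool:
    running = pool.submit(time.sleep, 0.5)  # picked up immediately
    queued = pool.submit(time.sleep, 0.5)   # waits for the only worker
    time.sleep(0.1)                         # give the first task time to start
    cancelled_running = running.cancel()    # False: already executing
    cancelled_queued = queued.cancel()      # True: never started, never will

print(cancelled_running, cancelled_queued)  # False True
```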
### `CpuExecutor`

For advanced control over process pools:

```python
with pyasync.CpuExecutor(max_workers=4, timeout=30.0) as executor:
    # Submit individual tasks
    task1 = executor.submit(compute, 1_000_000)
    task2 = executor.submit(compute, 2_000_000)

    # Get results
    print(task1.result())
    print(task2.result())

    # Or wait for all
    results = executor.wait_all()

    # Batch processing with map
    results = list(executor.map(compute, [1_000_000, 2_000_000, 3_000_000]))
```

## Example Output

```
$ python examples/simple_parallel.py
=== Simple Parallel Tasks ===
[Task A] Starting...
[Task B] Starting...
[Task C] Starting...
[Task A] Done!
[Task C] Done!
[Task B] Done!
Results: ['Task A completed', 'Task B completed', 'Task C completed']
Total time: 2.01s (longest task was 2s)
```
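The "total time tracks the longest task" behavior shown above is the defining property of parallel I/O-bound work. A small timing sketch with the standard library (illustrative, not from the examples directory):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def task(seconds):
    time.sleep(seconds)
    return seconds

durations = [0.2, 0.1, 0.2]  # running sequentially would take 0.5s
start = time.perf_counter()
with ThreadPoolExecutor() as pool:
    results = list(pool.map(task, durations))
elapsed = time.perf_counter() - start

print(results)
# Wall time tracks the longest task (~0.2s), not the sum (0.5s)
print(elapsed < sum(durations))  # True
```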
```
$ python examples/cpu_parallel_tasks.py
============================================================
CPU-Bound Parallel Tasks Example
============================================================

1. Parallel Prime Counting
----------------------------------------
Ranges: [(1, 100000), (100000, 300000), (300000, 500000), (500000, 1000000)]
Primes found: [9592, 16405, 15541, 36960]
Total primes: 78498
Sequential time: 1.11s
Parallel time: 0.76s
Speedup: 1.5x

2. Background Task with Monitoring
----------------------------------------
Task started...
done: False
cancelled: False
Task completed!
done: True
result: 9592 primes found
```
```
$ python examples/parallel_api_calls.py
=== Parallel API Calls ===
Fetching 3 users in parallel...
- Leanne Graham (Sincere@april.biz)
- Ervin Howell (Shanna@melissa.tv)
- Clementine Bauch (Nathan@yesenia.net)
```
## When to Use What

| Use Case | Function | Why |
|---|---|---|
| Multiple HTTP requests | `parallel()` | I/O-bound, threads work great |
| File operations | `parallel()` | I/O-bound |
| Data processing | `cpu_parallel()` | CPU-bound, needs true parallelism |
| Image/video processing | `cpu_parallel()` | CPU-bound |
| Long computation with timeout | `cpu_run(fn, timeout=10)` | Fine-grained control |
| Background computation | `cpu_background()` | Monitor and cancel if needed |
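The same split maps directly onto the standard library: `ThreadPoolExecutor` for the I/O-bound rows, `ProcessPoolExecutor` for the CPU-bound rows. A minimal sketch of the process side (hypothetical helper names, not pyasync code):

```python
from concurrent.futures import ProcessPoolExecutor

def compute(n):
    # CPU-bound work: runs in a separate process, bypassing the GIL
    return sum(i * i for i in range(n))

def run_cpu_parallel(args, max_workers=2):
    # Stdlib analogue of a cpu_parallel()-style helper
    with ProcessPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(compute, args))

if __name__ == "__main__":
    # The guard matters: worker processes may re-import this module
    print(run_cpu_parallel([10_000, 20_000]))
```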
## Testing

```
python -m unittest discover -s tests -v
```

## License

MIT