Description
Apache Airflow version
3.0.1
If "Other Airflow 2 version" selected, which one?
No response
What happened?
Apache Airflow version: 3.0.1
Executor: CeleryExecutor (distributed workers, multi-host setup)
Deployment method: Docker Compose (official apache/airflow:3.0.1 images)

Summary:
When running a distributed Airflow 3.0.1 setup with CeleryExecutor and remote workers, the workers consistently fail with a 405 Method Not Allowed error. Each worker tries to PATCH /api/v2/task-instances/{id}/run, but that endpoint is not implemented/exposed in the REST API, so every DAG task fails immediately.
Repro steps:
Deploy an Airflow 3.0.1 central node (webserver, scheduler, API server) and remote worker nodes via Docker Compose, all using the official images with no custom pip installs.
Trigger a DAG run.
Worker logs show:
```
ServerResponseError: Method Not Allowed
airflow.sdk.api.client.Client.patch("task-instances/{id}/run", ...)
```
Detailed logs and stacktrace:
```
/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/api/client.py:146 in start
resp = self.client.patch(f"task-instances/{id}/run", content=body.model_dump_json())
...
ServerResponseError: Method Not Allowed
```
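The failure mode seen in the logs can be sketched outside Airflow with a stand-in HTTP server. This is a minimal reproduction of the request shape only; the server here, its 405 behaviour, and the URL are assumptions mirroring the stack trace above, not Airflow code:

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib import error, request

# Hypothetical stand-in for an API server with no PATCH route on .../run.
class NoPatchRunHandler(BaseHTTPRequestHandler):
    def do_PATCH(self):
        # The real API server answered 405 Method Not Allowed here.
        self.send_response(405)
        self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request logging

server = HTTPServer(("127.0.0.1", 0), NoPatchRunHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# Same request shape as the worker's SDK client call (id is illustrative).
req = request.Request(
    f"http://127.0.0.1:{port}/api/v2/task-instances/1234/run",
    data=b"{}",
    method="PATCH",
)
status = None
try:
    request.urlopen(req)
except error.HTTPError as e:
    status = e.code

print(status)  # the worker sees this status and raises ServerResponseError
server.shutdown()
```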
What you think should happen instead?
Remote worker should be able to communicate and update the state via the API without 405 errors.
DAGs should run to completion.
What happened instead?
Every task execution fails with the 405 error above.
The worker cannot report task state because it PATCHes an endpoint that the API server does not expose.
Images used:
apache/airflow:3.0.1 (both central node and worker)
Steps already tried:
Checked for custom pip installs (none).
All images use the official version, verified with docker image ls.
Restarted from scratch, wiped all volumes.
Searched the docs and API spec: PATCH /api/v2/task-instances/{id}/run does not exist.
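One way to check for the endpoint is to look it up in the OpenAPI spec served by the API server and see whether the path/method pair is present. A minimal sketch of that lookup, using a hypothetical stand-in spec dict in place of the real fetched document:

```python
def endpoint_exists(spec: dict, path: str, method: str) -> bool:
    """Return True if the OpenAPI spec exposes `method` on `path`."""
    return method.lower() in spec.get("paths", {}).get(path, {})

# Stand-in spec contents (hypothetical, for illustration only); in practice
# you would load the JSON the API server publishes for its spec.
spec = {"paths": {"/api/v2/dags": {"get": {}}}}

print(endpoint_exists(spec, "/api/v2/task-instances/{id}/run", "PATCH"))
```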
Possible cause (hypothesis):
Internal SDK/API client in the worker references endpoints not exposed in the public API, or the distributed DAG processing implementation is incomplete for this executor setup.
There might be a version mismatch or a bug in the distributed execution support.
Is this a regression?
Unknown; distributed DAG processing was not tested with Airflow versions before 3.0.1.
How to reproduce
Environment Setup
Prepare a distributed Airflow 3.0.1 environment using the official Docker images.
Use CeleryExecutor.
Set up Redis as the Celery broker.
Set up PostgreSQL as the metadata database.
The environment must have:
A central node (running webserver, scheduler, and API server).
At least one remote worker node (only running Celery worker).
Use Docker Compose (attach your docker-compose.yml if possible).
Start the services in this order:
docker-compose up -d postgres redis
Wait for DB and Redis to be healthy.
docker-compose up airflow-init
docker-compose up -d airflow-api-server airflow-scheduler
Start the Celery worker on a separate node/container, using the same DB and Redis.
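The split described above can be sketched as a docker-compose fragment. Service names, commands, and the single env var shown are illustrative assumptions, not a verified config; a real deployment also needs healthchecks, connection strings, and the full AIRFLOW__ environment:

```yaml
# Hypothetical minimal layout: central node services plus a separate worker.
services:
  postgres:
    image: postgres:15
  redis:
    image: redis:7
  airflow-api-server:
    image: apache/airflow:3.0.1
    command: api-server
    environment:
      AIRFLOW__CORE__EXECUTOR: CeleryExecutor
  airflow-scheduler:
    image: apache/airflow:3.0.1
    command: scheduler
  airflow-worker:          # in a multi-host setup this runs on another node,
    image: apache/airflow:3.0.1   # pointed at the same DB and Redis
    command: celery worker
```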
Operating System
Ubuntu 22
Versions of Apache Airflow Providers
Airflow 3.0.1
Deployment
Docker-Compose
Deployment details
No response
Anything else?
No response
Are you willing to submit PR?
- Yes I am willing to submit a PR!
Code of Conduct
- I agree to follow this project's Code of Conduct