Description
Apache Airflow version
3.1.7
If "Other Airflow 3 version" selected, which one?
No response
What happened?
I created a local development setup with CeleryExecutor. I ran the airflow celery worker command, which almost immediately caused the process to crash with the following stack trace:
python -m airflow celery worker
2026-02-26T09:59:51.012846Z [info ] setup plugin alembic.autogenerate.schemas [alembic.runtime.plugins] loc=plugins.py:37
2026-02-26T09:59:51.012950Z [info ] setup plugin alembic.autogenerate.tables [alembic.runtime.plugins] loc=plugins.py:37
2026-02-26T09:59:51.012997Z [info ] setup plugin alembic.autogenerate.types [alembic.runtime.plugins] loc=plugins.py:37
2026-02-26T09:59:51.013034Z [info ] setup plugin alembic.autogenerate.constraints [alembic.runtime.plugins] loc=plugins.py:37
2026-02-26T09:59:51.013065Z [info ] setup plugin alembic.autogenerate.defaults [alembic.runtime.plugins] loc=plugins.py:37
2026-02-26T09:59:51.013096Z [info ] setup plugin alembic.autogenerate.comments [alembic.runtime.plugins] loc=plugins.py:37
2026-02-26T09:59:51.797625Z [info ] starting stale bundle cleanup process [airflow.providers.celery.cli.celery_command] loc=celery_command.py:141
Traceback (most recent call last):
File "<PROJECT_PATH>/.venv/lib/python3.13/site-packages/airflow/providers/celery/cli/celery_command.py", line 152, in _run_stale_bundle_cleanup
sub_proc.start()
~~~~~~~~~~~~~~^^
File "/opt/homebrew/Cellar/python@3.13/3.13.12_1/Frameworks/Python.framework/Versions/3.13/lib/python3.13/multiprocessing/process.py", line 121, in start
self._popen = self._Popen(self)
~~~~~~~~~~~^^^^^^
File "/opt/homebrew/Cellar/python@3.13/3.13.12_1/Frameworks/Python.framework/Versions/3.13/lib/python3.13/multiprocessing/context.py", line 224, in _Popen
return _default_context.get_context().Process._Popen(process_obj)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^
File "/opt/homebrew/Cellar/python@3.13/3.13.12_1/Frameworks/Python.framework/Versions/3.13/lib/python3.13/multiprocessing/context.py", line 289, in _Popen
return Popen(process_obj)
File "/opt/homebrew/Cellar/python@3.13/3.13.12_1/Frameworks/Python.framework/Versions/3.13/lib/python3.13/multiprocessing/popen_spawn_posix.py", line 32, in __init__
super().__init__(process_obj)
~~~~~~~~~~~~~~~~^^^^^^^^^^^^^
File "/opt/homebrew/Cellar/python@3.13/3.13.12_1/Frameworks/Python.framework/Versions/3.13/lib/python3.13/multiprocessing/popen_fork.py", line 20, in __init__
self._launch(process_obj)
~~~~~~~~~~~~^^^^^^^^^^^^^
File "/opt/homebrew/Cellar/python@3.13/3.13.12_1/Frameworks/Python.framework/Versions/3.13/lib/python3.13/multiprocessing/popen_spawn_posix.py", line 47, in _launch
reduction.dump(process_obj, fp)
~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Cellar/python@3.13/3.13.12_1/Frameworks/Python.framework/Versions/3.13/lib/python3.13/multiprocessing/reduction.py", line 60, in dump
ForkingPickler(file, protocol).dump(obj)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^
AttributeError: Can't get local object '_run_stale_bundle_cleanup.<locals>.bundle_cleanup_main'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "<PROJECT_PATH>/.venv/lib/python3.13/site-packages/airflow/__main__.py", line 59, in <module>
main()
~~~~^^
File "<PROJECT_PATH>/.venv/lib/python3.13/site-packages/airflow/__main__.py", line 55, in main
args.func(args)
~~~~~~~~~^^^^^^
File "<PROJECT_PATH>/.venv/lib/python3.13/site-packages/airflow/cli/cli_config.py", line 49, in command
return func(*args, **kwargs)
File "<PROJECT_PATH>/.venv/lib/python3.13/site-packages/airflow/utils/cli.py", line 114, in wrapper
return f(*args, **kwargs)
File "<PROJECT_PATH>/.venv/lib/python3.13/site-packages/airflow/providers/celery/cli/celery_command.py", line 67, in wrapper
providers_configuration_loaded(func)(*args, **kwargs)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
File "<PROJECT_PATH>/.venv/lib/python3.13/site-packages/airflow/utils/providers_configuration_loader.py", line 54, in wrapped_function
return func(*args, **kwargs)
File "<PROJECT_PATH>/.venv/lib/python3.13/site-packages/airflow/providers/celery/cli/celery_command.py", line 318, in worker
_run_command_with_daemon_option(
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^
args=args,
^^^^^^^^^^
...<4 lines>...
pid_file=worker_pid_file_path,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
)
^
File "<PROJECT_PATH>/.venv/lib/python3.13/site-packages/airflow/providers/celery/cli/celery_command.py", line 53, in _run_command_with_daemon_option
run_command_with_daemon_option(*args, **kwargs)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
File "<PROJECT_PATH>/.venv/lib/python3.13/site-packages/airflow/cli/commands/daemon_utils.py", line 86, in run_command_with_daemon_option
callback()
~~~~~~~~^^
File "<PROJECT_PATH>/.venv/lib/python3.13/site-packages/airflow/providers/celery/cli/celery_command.py", line 310, in run_celery_worker
with _serve_logs(skip_serve_logs), _run_stale_bundle_cleanup():
~~~~~~~~~~~~~~~~~~~~~~~~~^^
File "/opt/homebrew/Cellar/python@3.13/3.13.12_1/Frameworks/Python.framework/Versions/3.13/lib/python3.13/contextlib.py", line 141, in __enter__
return next(self.gen)
File "<PROJECT_PATH>/.venv/lib/python3.13/site-packages/airflow/providers/celery/cli/celery_command.py", line 156, in _run_stale_bundle_cleanup
sub_proc.terminate()
~~~~~~~~~~~~~~~~~~^^
File "/opt/homebrew/Cellar/python@3.13/3.13.12_1/Frameworks/Python.framework/Versions/3.13/lib/python3.13/multiprocessing/process.py", line 133, in terminate
self._popen.terminate()
^^^^^^^^^^^^^^^^^^^^^
AttributeError: 'NoneType' object has no attribute 'terminate'
Process finished with exit code 1
I replaced my actual project path prefix with <PROJECT_PATH> for ease of reading.
What you think should happen instead?
A valid Celery worker should start and wait for tasks.
How to reproduce
Just run the airflow celery worker command on macOS.
I also set the environment variable AIRFLOW__CELERY__POOL=solo to override the default prefork pool, since prefork also causes issues on macOS; the solo pool is enough for my intended purpose.
Operating System
macOS Tahoe 26.3
Versions of Apache Airflow Providers
apache-airflow-providers-celery==3.16.0
Deployment
Virtualenv installation
Deployment details
I've created a .venv with airflow==3.1.7 and the aforementioned Celery provider.
I created this exact same setup with Python 3.12, 3.13 and 3.14, with the same result each time.
I use the following environment variables, which I pass to every Airflow process:
AIRFLOW_HOME=<PROJECT_PATH>
AIRFLOW__CORE__EXECUTOR=CeleryExecutor
AIRFLOW__CELERY__BROKER_URL=amqp://airflow:***@localhost:5672//
AIRFLOW__CELERY__POOL=solo
I run external services (PostgreSQL, RabbitMQ, etc.) in Docker containers.
Anything else?
I already investigated the problem, and it seems to be related to the different process start methods: macOS uses spawn() by default, whereas Linux uses fork(). Because of this, the locally defined function bundle_cleanup_main() inside _run_stale_bundle_cleanup() (airflow/providers/celery/cli/celery_command.py) fails to pickle when the OS creates the child process via spawn(), while it runs fine with fork().
This problem occurs every single time I try to run a Celery worker on macOS, unless I also set AIRFLOW__DAG_PROCESSOR__STALE_BUNDLE_CLEANUP_INTERVAL=0, which stops the bundle cleanup process from being created at all. I've tested for this behaviour on a Linux machine and it never caused any issue. Furthermore, the problem also goes away on macOS when I explicitly call multiprocessing.set_start_method("fork"), which makes the process behave the same way as on Linux.
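The root cause can be demonstrated in a few lines, independent of Airflow: under the spawn start method, multiprocessing serializes the Process target with pickle, and a function defined inside another function cannot be pickled. The names make_parent/local_target below are hypothetical, chosen only for illustration:

```python
import pickle

def make_parent():
    # Analogous to bundle_cleanup_main() being defined inside
    # _run_stale_bundle_cleanup(): a function local to another function.
    def local_target():
        pass
    return local_target

# pickle serializes functions by qualified name; a "<locals>" name cannot
# be resolved by importing the module, so serialization fails. This is
# what multiprocessing does to the Process target under "spawn", and why
# fork() (which inherits memory instead of pickling) is unaffected.
try:
    pickle.dumps(make_parent())
    print("pickled OK")
except (pickle.PicklingError, AttributeError) as exc:
    print(f"cannot pickle local function: {exc}")
```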
The arguments to Process usually need to be unpickleable from within the child process. [...]
The locally defined function bundle_cleanup_main() violates this requirement, which is why the bug manifests on macOS.
Are you willing to submit PR?
- Yes I am willing to submit a PR!
Code of Conduct
- I agree to follow this project's Code of Conduct