Airflow dag processor exits with too many open files after some time #49887

@asasisekar

Description

Apache Airflow version

3.0.0

If "Other Airflow 2 version" selected, which one?

No response

What happened?

Upgraded Airflow from 2.10 to 3.0.0, and the DagProcessorJob fails after a few hours with the exception below:
OSError: [Errno 24] Too many open files

[2025-04-25T21:30:41.765+0100] {dag_processor_job_runner.py:63} ERROR - Exception when executing DagProcessorJob
Traceback (most recent call last):
  File "/var/opt/rdos-airflow/venv/lib/python3.10/site-packages/airflow/jobs/dag_processor_job_runner.py", line 61, in _execute
    self.processor.run()
  File "/var/opt/rdos-airflow/venv/lib/python3.10/site-packages/airflow/dag_processing/manager.py", line 262, in run
    return self._run_parsing_loop()
  File "/var/opt/rdos-airflow/venv/lib/python3.10/site-packages/airflow/dag_processing/manager.py", line 347, in _run_parsing_loop
    self._start_new_processes()
  File "/var/opt/rdos-airflow/venv/lib/python3.10/site-packages/airflow/dag_processing/manager.py", line 894, in _start_new_processes
    processor = self._create_process(file)
  File "/var/opt/rdos-airflow/venv/lib/python3.10/site-packages/airflow/dag_processing/manager.py", line 876, in _create_process
    return DagFileProcessorProcess.start(
  File "/var/opt/rdos-airflow/venv/lib/python3.10/site-packages/airflow/dag_processing/processor.py", line 245, in start
    proc: Self = super().start(target=target, **kwargs)
  File "/var/opt/rdos-airflow/venv/lib/python3.10/site-packages/airflow/sdk/execution_time/supervisor.py", line 446, in start
    child_comms, read_msgs = mkpipe()
  File "/var/opt/rdos-airflow/venv/lib/python3.10/site-packages/airflow/sdk/execution_time/supervisor.py", line 177, in mkpipe
    rsock, wsock = socketpair()
  File "/var/opt/icetools/python/ICEpythonvenv310/python-3.10.1/lib/python3.10/socket.py", line 607, in socketpair
    a, b = _socket.socketpair(family, type, proto)
OSError: [Errno 24] Too many open files
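
For illustration, here is a minimal sketch (not Airflow code) of the failure mode the traceback points at: every call to socket.socketpair() consumes two file descriptors, so if the supervisor's pipe ends are never closed after a parse round, the process eventually hits the nofile limit with this exact errno:

import socket

# Minimal sketch, not Airflow code: simulate pipe ends that are never closed.
pairs = []
try:
    while True:
        a, b = socket.socketpair()  # two new file descriptors per call
        pairs.append((a, b))        # keep them open, as a leak would
except OSError as exc:
    # Reproduces: OSError: [Errno 24] Too many open files
    print(f"socketpair() failed after {len(pairs)} pairs: {exc}")
finally:
    for a, b in pairs:
        a.close()
        b.close()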

What do you think should happen instead?

No response

How to reproduce

nohup airflow dag-processor

DAG Processor Config

export AIRFLOW__DAG_PROCESSOR__BUNDLE_REFRESH_CHECK_INTERVAL=10
export AIRFLOW__DAG_PROCESSOR__MIN_FILE_PROCESS_INTERVAL=120
export AIRFLOW__DAG_PROCESSOR__PARSING_PROCESSES=1
export AIRFLOW__DAG_PROCESSOR__REFRESH_INTERVAL=900
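
To confirm the leak while reproducing, a small watcher (a hypothetical helper, Linux-only since it reads /proc) can log the dag-processor's open-descriptor count; on an affected host the number should climb steadily between parse rounds:

import os
import sys
import time

def count_fds(pid: int) -> int:
    # Each entry in /proc/<pid>/fd is one open descriptor of that process.
    return len(os.listdir(f"/proc/{pid}/fd"))

if __name__ == "__main__":
    pid = int(sys.argv[1])  # PID of the running `airflow dag-processor`
    while True:
        print(f"{time.strftime('%H:%M:%S')} open fds: {count_fds(pid)}")
        time.sleep(30)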

Operating System

RHEL 8.8

Versions of Apache Airflow Providers

No response

Deployment

Virtualenv installation

Deployment details

No response

Anything else?

This problem occurs consistently, once every few hours.
The number of DAGs is fewer than 10.
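
As a stopgap until the leak itself is fixed, raising the soft nofile limit only postpones the crash, but it can keep the dag-processor alive longer between restarts. A sketch using the stdlib resource module (a wrapper script of my own, not an Airflow feature):

import os
import resource

# Stopgap only, not a fix: raise the soft nofile limit to the hard limit,
# then exec the dag-processor so it inherits the new limit. This delays
# the crash if descriptors genuinely leak; it does not stop the leak.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))
os.execvp("airflow", ["airflow", "dag-processor"])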

Are you willing to submit PR?

  • Yes I am willing to submit a PR!

Code of Conduct

  • I agree to follow this project's Code of Conduct

Labels

affected_version:3.0, area:core, kind:bug, priority:high
