
max_active_runs is 64, but with 16 active runs it says the constraint is reached #57604

@DonHaul

Description


Apache Airflow version

Other Airflow 2/3 version (please specify below)

If "Other Airflow 2/3 version" selected, which one?

3.0.1

What happened?

I recently increased max_active_runs to 64, but the scheduler keeps logging the following:
dag cannot be started due to dag max_active_runs constraint; active_runs=16 max_active_runs=64
which doesn't make much sense: 16 active runs should not hit a limit of 64.

Here are some examples:

[2025-10-31T08:44:00.092+0000] {scheduler_job_runner.py:1802} INFO - dag cannot be started due to dag max_active_runs constraint; active_runs=16 max_active_runs=64 dag_id=hep_create_dag run_id=1f537f7a-5022-47e1-bfa3-ec2f150f4571
[2025-10-31T08:44:00.094+0000] {scheduler_job_runner.py:1802} INFO - dag cannot be started due to dag max_active_runs constraint; active_runs=16 max_active_runs=64 dag_id=hep_create_dag run_id=12401d58-e44c-40e2-87f8-009725832346
[2025-10-31T08:44:00.095+0000] {scheduler_job_runner.py:1802} INFO - dag cannot be started due to dag max_active_runs constraint; active_runs=16 max_active_runs=64 dag_id=hep_create_dag run_id=e3ed3ee4-b1c6-4577-9fcb-63a8c10ba0e6
[2025-10-31T08:44:00.096+0000] {scheduler_job_runner.py:1802} INFO - dag cannot be started due to dag max_active_runs constraint; active_runs=16 max_active_runs=64 dag_id=hep_create_dag run_id=0ee100f7-08e1-4c8a-bcb4-e1aa2c5eabde
[2025-10-31T08:44:00.097+0000] {scheduler_job_runner.py:1802} INFO - dag cannot be started due to dag max_active_runs constraint; active_runs=16 max_active_runs=64 dag_id=hep_create_dag run_id=ee99ce33-b5f6-4ab6-80d2-9300a95e6180
[2025-10-31T08:44:00.098+0000] {scheduler_job_runner.py:1802} INFO - dag cannot be started due to dag max_active_runs constraint; active_runs=16 max_active_runs=64 dag_id=hep_create_dag run_id=68b10c34-3371-4aa4-997f-9d22da87271a

My guess is that dag.max_active_runs (the value the check compares against) and dag_run.max_active_runs (the value the log message reports) can at times differ:

if active_runs >= dag.max_active_runs:
    # todo: delete all candidate dag runs for this dag from list right now
    self.log.info(
        "dag cannot be started due to dag max_active_runs constraint; "
        "active_runs=%s max_active_runs=%s dag_id=%s run_id=%s",
        active_runs,
        dag_run.max_active_runs,
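To illustrate the suspected mismatch, here is a minimal, self-contained sketch (not Airflow code; the class and function names are illustrative assumptions): when the gate compares against one attribute but the log message is built from another, the printed numbers can contradict the decision actually taken, which would explain a message saying max_active_runs=64 while runs are blocked at 16.

```python
class DagModel:
    """Stand-in for the DAG-level limit the scheduler compares against."""
    def __init__(self, max_active_runs):
        self.max_active_runs = max_active_runs


class DagRunRow:
    """Stand-in for a per-run value that may reflect the newer setting."""
    def __init__(self, max_active_runs):
        self.max_active_runs = max_active_runs


def can_start(active_runs, dag, dag_run, log):
    """Gate a new run; note the check and the message use different sources."""
    if active_runs >= dag.max_active_runs:  # decision uses the DAG-level value
        log.append(
            f"active_runs={active_runs} "
            f"max_active_runs={dag_run.max_active_runs}"  # message uses the run-level value
        )
        return False
    return True


# Stale DAG-level limit of 16, run-level value already updated to 64:
dag = DagModel(16)
run = DagRunRow(64)
log = []

blocked = can_start(16, dag, run, log)
print(blocked, log[0])  # blocked at 16, yet the message reports 64
```

The sketch only demonstrates why the log output can be misleading; whether the actual bug is a stale dag.max_active_runs or a mislabeled log argument would need to be confirmed against the scheduler source.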

What you think should happen instead?

The number of active DAG runs should ramp up toward 64.
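The expected ramp-up can be sketched as a simple headroom calculation (assumption: the scheduler re-reads the current max_active_runs each loop rather than a value snapshotted when the limit was still 16):

```python
def startable_runs(queued, active_runs, max_active_runs):
    """How many queued runs may start given the current limit."""
    return min(queued, max(0, max_active_runs - active_runs))


# Before the redeploy: limit 16, 16 running -> nothing more starts.
print(startable_runs(queued=50, active_runs=16, max_active_runs=16))  # 0

# After the redeploy: limit 64, 16 running -> up to 48 more should start.
print(startable_runs(queued=50, active_runs=16, max_active_runs=64))  # 48
```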

How to reproduce

Start with max_active_runs set to 16, then redeploy with max_active_runs set to 64.

Operating System

Debian GNU/Linux 12 (bookworm)

Versions of Apache Airflow Providers

No response

Deployment

Official Apache Airflow Helm Chart

Deployment details

airflow.cfg core section:

[core]
dags_folder = /opt/airflow/dags
hostname_callable = airflow.utils.net.getfqdn
might_contain_dag_callable = airflow.utils.file.might_contain_dag_via_default_heuristic
default_timezone = utc
executor = CeleryExecutor
auth_manager = airflow.providers.fab.auth_manager.fab_auth_manager.FabAuthManager
simple_auth_manager_users = admin:admin
simple_auth_manager_all_admins = False
# simple_auth_manager_passwords_file = 
parallelism = 64
max_active_tasks_per_dag = 66
dags_are_paused_at_creation = false
max_active_runs_per_dag = 64
max_consecutive_failed_dag_runs_per_dag = 0
# mp_start_method = 
load_examples = false
plugins_folder = /opt/airflow/plugins
execute_tasks_new_python_interpreter = False
fernet_key = N0REWDNyQ1ExZm9WRXRtZDhWQzR0aEphOFh1bXhQOEI=
donot_pickle = True
dagbag_import_timeout = 30.0
dagbag_import_error_tracebacks = True
dagbag_import_error_traceback_depth = 2
default_impersonation = 
security = 
unit_test_mode = False
allowed_deserialization_classes = airflow.*
allowed_deserialization_classes_regexp = 
killed_task_cleanup_time = 60
dag_run_conf_overrides_params = True
dag_discovery_safe_mode = True
dag_ignore_file_syntax = glob
default_task_retries = 0
default_task_retry_delay = 300
max_task_retry_delay = 86400
default_task_weight_rule = downstream
task_success_overtime = 20
default_task_execution_timeout = 
min_serialized_dag_update_interval = 30
compress_serialized_dags = False
min_serialized_dag_fetch_interval = 10
max_num_rendered_ti_fields_per_task = 30
xcom_backend = airflow.sdk.execution_time.xcom.BaseXCom
lazy_load_plugins = True
lazy_discover_providers = True
hide_sensitive_var_conn_fields = True
sensitive_var_conn_names = 
default_pool_task_slot_count = 128
max_map_length = 4096
daemon_umask = 0o077

I can provide more details if required.

Anything else?

No response

Are you willing to submit PR?

  • Yes I am willing to submit a PR!
