Description
Apache Airflow version
3.1.3
If "Other Airflow 2/3 version" selected, which one?
No response
What happened?
Since we upgraded to Airflow 3.1.3, we randomly get the following error when triggering a DagRun manually from the UI.
What you think should happen instead?
The DagRun should be triggered without any constraint violation. The generated DAG run_id appears to be a timestamp without minutes/seconds/..., which could explain the duplicate key: if you have already manually triggered the same DAG on the same day, it seems logical that a second trigger would fail given how the run_id is now generated.
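To illustrate the suspected collision: if the manual run_id is derived from a timestamp truncated to the day, two manual triggers on the same day produce identical keys. This is a minimal sketch of that hypothesis; the `make_run_id` helper and its exact format are assumptions for illustration, not Airflow's actual implementation.

```python
from datetime import datetime, timezone

def make_run_id(trigger_time: datetime) -> str:
    # Hypothetical run_id format: date only, no time component,
    # mirroring the truncated timestamp described above.
    return f"manual__{trigger_time.strftime('%Y-%m-%d')}"

first = make_run_id(datetime(2025, 11, 20, 9, 15, tzinfo=timezone.utc))
second = make_run_id(datetime(2025, 11, 20, 17, 42, tzinfo=timezone.utc))
# Two triggers on the same day yield the same run_id, so a
# UNIQUE (dag_id, run_id) constraint on dag_run would be violated.
print(first == second)  # True
```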
How to reproduce
It's hard to reproduce: for most DAGs it works, and for some it fails randomly. At first I thought it was related to the fact that we use multiple schedulers and YugabyteDB, but the strange thing is that we never encountered this issue until 3.1.3. We also get the same error with a Postgres database.
Example of a DAG having this issue:
from datetime import datetime, timedelta
from pathlib import Path

from airflow import DAG
from airflow.providers.common.sql.operators.generic_transfer import GenericTransfer

# Placeholders for values defined elsewhere in the original DAG file.
source_conn_id = "oracle_source"
dest_conn_id = "postgres_dest"
target_schema_name = "staging"
table_name = "MY_TABLE"
columns = ["col_a", "col_b"]

with DAG(
    default_args={
        "email_on_failure": True,
        "email_on_retry": False,
        "retries": 2,
        "retry_delay": timedelta(minutes=10),
    },
    dag_id=Path(__file__).stem,
    description=__doc__.format(table=table_name).partition(".")[0],
    doc_md=__doc__.format(table=table_name),
    schedule="0 20 * * 0",
    start_date=datetime(2025, 10, 1),
    catchup=False,
    max_active_runs=1,
):
    load_data_to_postgres = GenericTransfer(
        task_id="load_data_to_postgres",
        source_conn_id=source_conn_id,
        destination_conn_id=dest_conn_id,
        destination_table=f"{target_schema_name}.{table_name.lower()}",
        sql=f"sql/extract_{table_name.lower()}_oracle.sql",
        preoperator="sql/truncate_table_postgres.jinja.sql",
        params={"schema": target_schema_name, "table": table_name.lower()},
        insert_args={"replace": True, "target_fields": columns},
    )
Operating System
Red Hat Linux
Versions of Apache Airflow Providers
No response
Deployment
Other 3rd-party Helm chart
Deployment details
No response
Anything else?
No response
Are you willing to submit PR?
- Yes I am willing to submit a PR!
Code of Conduct
- I agree to follow this project's Code of Conduct