Closed as not planned
Labels
area:providers, kind:bug (This is clearly a bug), needs-triage (label for new issues that we didn't triage yet), provider:google (Google (including GCP) related issues)
Description
Apache Airflow Provider(s)
google
Versions of Apache Airflow Providers
16.1.0
Apache Airflow version
3.0.3
Operating System
debian
Deployment
Other Docker-based deployment
Deployment details
No response
What happened
For a DAG triggered by an Asset (no time-based schedule), logical_date is not available in the dag_run context.
GCSToBigQueryOperator uses context['logical_date'] to generate the job_id, so the task fails for DAG runs triggered by an asset.
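A defensive pattern (a sketch of a possible fix, not the provider's actual code; the helper name below is hypothetical) is to read logical_date with context.get and fall back to the always-present run_id when it is None:

```python
from datetime import datetime, timezone


def build_job_id_suffix(context):
    # logical_date may be absent or None for asset-triggered DAG runs.
    logical_date = context.get("logical_date")
    if logical_date is not None:
        return logical_date.strftime("%Y%m%dT%H%M%S")
    # Fall back to run_id, which every DAG run has, sanitizing characters
    # that are not valid in a BigQuery job_id.
    return context["run_id"].replace(":", "_").replace("+", "_")


# Scheduled run: logical_date is present.
ctx = {
    "logical_date": datetime(2024, 5, 1, tzinfo=timezone.utc),
    "run_id": "scheduled__2024-05-01",
}
print(build_job_id_suffix(ctx))  # 20240501T000000

# Asset-triggered run: logical_date is None, so run_id is used instead.
ctx = {"logical_date": None, "run_id": "asset__2024-05-01T00_00"}
print(build_job_id_suffix(ctx))  # asset__2024-05-01T00_00
```

With this fallback the job_id stays unique per run without assuming a logical_date exists.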
What you think should happen instead
No error: the operator should generate a job_id even when logical_date is missing from the context.
How to reproduce
Triggering the DAG directly works (logical_date is set).
Triggering the DAG via the asset raises the error.
from airflow.sdk import DAG, Asset
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

source_asset = Asset("/source_data")

with DAG(
    dag_id="no_logical_date",
    schedule=[source_asset],
    tags=["debug"],
):
    gcs_to_gbq = GCSToBigQueryOperator(
        task_id="gcs_to_gbq",
        bucket="example",
        source_objects=["some_data.csv"],
        destination_project_dataset_table="demo.example_table",
        gcp_conn_id="gcp-default",
    )
Anything else
No response
Are you willing to submit PR?
- Yes I am willing to submit a PR!
Code of Conduct
- I agree to follow this project's Code of Conduct