Can't use logical_date in GCSToBigQueryOperator #53634

@obarisk

Description

Apache Airflow Provider(s)

google

Versions of Apache Airflow Providers

16.1.0

Apache Airflow version

3.0.3

Operating System

debian

Deployment

Other Docker-based deployment

Deployment details

No response

What happened

For a DAG triggered by an Asset (i.e. without a time-based schedule),
logical_date is not available in the dag_run context.

GCSToBigQueryOperator uses context['logical_date'] to generate the job_id,
so DAGs triggered by an asset fail with an error.
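A minimal sketch of the kind of fallback I have in mind (build_job_id and its behavior are hypothetical, not the operator's actual helper): prefer logical_date when present, otherwise fall back to run_id, then to the current UTC time.

```python
from datetime import datetime, timezone

def build_job_id(context: dict) -> str:
    # Hypothetical sketch: prefer logical_date, fall back to run_id,
    # then to the current UTC time, so asset-triggered runs don't fail.
    logical_date = context.get("logical_date")
    if logical_date is not None:
        stamp = logical_date.strftime("%Y%m%dT%H%M%S")
    else:
        stamp = context.get("run_id") or datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%S")
    return f"gcs_to_bq_{stamp}"

# Asset-triggered runs: no logical_date in the context
print(build_job_id({"run_id": "asset_triggered__2025-01-01"}))
# → gcs_to_bq_asset_triggered__2025-01-01
```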

What you think should happen instead

No error: the operator should still generate a job_id when logical_date is absent, for example by falling back to another unique value such as run_id.

How to reproduce

Triggering the DAG directly works (logical_date is set).
Triggering it via the asset raises the error.

from airflow.sdk import DAG, Asset
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
  GCSToBigQueryOperator
)

source_asset = Asset("/source_data")

with DAG(
  dag_id="no_logical_date",
  schedule=[source_asset],
  tags=["debug"]
):
  gcs_to_gbq = GCSToBigQueryOperator(
    task_id="gcs_to_gbq",
    bucket="example",
    source_objects=["some_data.csv"],
    destination_project_dataset_table="demo.example_table",
    gcp_conn_id="gcp-default",
  )

  gcs_to_gbq

Anything else

No response

Are you willing to submit PR?

  • Yes I am willing to submit a PR!

Code of Conduct

  • I agree to follow this project's Code of Conduct

Metadata

Assignees

No one assigned

    Labels

    area:providers, kind:bug, needs-triage, provider:google
