Closed
Labels
area:providers · kind:bug (this is clearly a bug) · needs-triage (label for new issues that we didn't triage yet) · provider:google (Google (including GCP) related issues)
Description
Apache Airflow Provider(s)
google
Versions of Apache Airflow Providers
apache-airflow-providers-google==16.1.0
Apache Airflow version
3.0.2
Operating System
gcp kubernetes
Deployment
Official Apache Airflow Helm Chart
Deployment details
No response
What happened
I'm using an operator defined like this:

GCSToBigQueryOperator(
    task_id=task_id,
    bucket=consts.BACKUP_BUCKET,
    source_objects=[
        f'{consts.PROJECT}/reference/{{{{ ts }}}}/{{{{ params.reference_table }}}}.json',
    ],
    destination_project_dataset_table=f'{consts.DATASET}.{consts.TABLE_NAME}',
    schema_fields=dest_table_schema,
    write_disposition='WRITE_TRUNCATE',
    source_format='NEWLINE_DELIMITED_JSON',
    params={'reference_table': consts.TABLE_NAME},
    gcp_conn_id=consts.GCP_CONN_ID,
)

It fails with the following logs:
[2025-07-08, 15:00:31] INFO - Connection Retrieved 'xxx_google_cloud': source="airflow.hooks.base"
[2025-07-08, 15:00:31] INFO - Using existing BigQuery table for storing data...: source="airflow.task.operators.airflow.providers.google.cloud.transfers.gcs_to_bigquery.GCSToBigQueryOperator"
[2025-07-08, 15:00:31] INFO - Getting connection using `google.auth.default()` since no explicit credentials are provided.: source="airflow.providers.google.cloud.utils.credentials_provider._CredentialProvider"
[2025-07-08, 15:00:31] INFO - Project is not included in destination_project_dataset_table: Validation.ErrorReference; using project "xxx": source="airflow.task.hooks.airflow.providers.google.cloud.hooks.bigquery.BigQueryHook"
[2025-07-08, 15:00:31] INFO - Executing: {'load': {'autodetect': True, 'createDisposition': 'CREATE_IF_NEEDED', 'destinationTable': {'projectId': 'xxx', 'datasetId': 'xxx', 'tableId': 'xxx'}, 'sourceFormat': 'NEWLINE_DELIMITED_JSON', 'sourceUris': ['gs://xxx/xxx/reference/2025-07-08T12:52:19.515000+00:00/Reference.json'], 'writeDisposition': 'WRITE_TRUNCATE', 'ignoreUnknownValues': False, 'schema': {'fields': [{'name': 'error_id', 'type': 'STRING', 'primary_key': True, 'mode': 'REQUIRED'}, {'name': 'error_type', 'type': 'STRING', 'primary_key': False, 'mode': 'REQUIRED'}, {'name': 'error_name', 'type': 'STRING', 'primary_key': False, 'mode': 'REQUIRED'}, {'name': 'check_function', 'type': 'STRING', 'primary_key': False, 'mode': 'REQUIRED'}, {'name': 'description_en', 'type': 'STRING', 'primary_key': False, 'mode': 'REQUIRED'}]}}}: source="airflow.task.operators.airflow.providers.google.cloud.transfers.gcs_to_bigquery.GCSToBigQueryOperator"
[2025-07-08, 15:00:31] INFO - Inserting job airflow_xxx_xxx_2025_07_08T12_52_19_515000_00_00_7e0440558923cf207dfead9989fb7188: source="airflow.task.hooks.airflow.providers.google.cloud.hooks.bigquery.BigQueryHook"
[2025-07-08, 15:00:34] ERROR - Task failed with exception: source="task"
TypeError: cannot serialize object of type <class 'function'>
File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/execution_time/task_runner.py", line 867 in run
File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/execution_time/task_runner.py", line 1159 in _execute_task
File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/bases/operator.py", line 397 in wrapper
File "/home/airflow/.local/lib/python3.12/site-packages/airflow/providers/google/cloud/transfers/gcs_to_bigquery.py", line 439 in execute
File "/home/airflow/.local/lib/python3.12/site-packages/airflow/providers/google/cloud/links/base.py", line 81 in persist
File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/execution_time/task_runner.py", line 390 in xcom_push
File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/execution_time/task_runner.py", line 532 in _xcom_push
File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/bases/xcom.py", line 64 in set
File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/bases/xcom.py", line 291 in serialize_value
File "/home/airflow/.local/lib/python3.12/site-packages/airflow/serialization/serde.py", line 135 in serialize
File "/home/airflow/.local/lib/python3.12/site-packages/airflow/serialization/serde.py", line 175 in serialize
File "/home/airflow/.local/lib/python3.12/site-packages/airflow/serialization/serde.py", line 135 in serialize
File "/home/airflow/.local/lib/python3.12/site-packages/airflow/serialization/serde.py", line 129 in serialize
File "/home/airflow/.local/lib/python3.12/site-packages/airflow/serialization/serde.py", line 185 in serialize
The operator links implementation was changed in this version, so it's probably related: #51576
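To illustrate the failure mode, here is a minimal stand-in sketch (not Airflow's actual serde module): when an XCom-style serializer walks a payload and finds a plain function object among the values, it raises the same class of error seen in the traceback. The `serialize` helper and the `payload` keys below are hypothetical, chosen only to mirror the error message.

```python
import json

def serialize(value):
    """Minimal stand-in for an XCom-style serializer (illustrative only,
    not Airflow's real airflow.serialization.serde): walks the value and
    rejects anything callable, mirroring the TypeError in the logs."""
    if callable(value):
        raise TypeError(f"cannot serialize object of type {type(value)}")
    if isinstance(value, dict):
        return {k: serialize(v) for k, v in value.items()}
    if isinstance(value, (list, tuple)):
        return [serialize(v) for v in value]
    json.dumps(value)  # raises TypeError for other unsupported types
    return value

# A link payload that accidentally carries a function (e.g. an unrendered
# callable attribute) triggers the same class of error:
payload = {"dataset_id": "xxx", "project_id": lambda: "xxx"}
try:
    serialize(payload)
except TypeError as exc:
    print(exc)  # cannot serialize object of type <class 'function'>
```

This suggests the link `persist` call is pushing an XCom value that still contains an unrendered callable rather than plain data.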
What you think should happen instead
No response
How to reproduce
Described in the "What happened" section above.
Anything else
No response
Are you willing to submit PR?
- Yes I am willing to submit a PR!
Code of Conduct
- I agree to follow this project's Code of Conduct
Reported by kyungjunleeme