Closed as duplicate of #52501
Labels
area:logging, area:providers, kind:bug (this is clearly a bug), kind:meta (high-level information important to the community)
Description
When remote logging is enabled, I am seeing connection errors in the scheduler:
[2025-05-14T07:38:05.993+0000] {base_aws.py:609} WARNING - Unable to find AWS Connection ID 's3_conn', switching to empty.
[2025-05-14T07:38:05.995+0000] {base_aws.py:185} INFO - No connection ID provided. Fallback on boto3 credential strategy (region_name=None). See: https://boto3.amazonaws.com/v1/documentation/api/latest/guide/configuration.html
[2025-05-14T07:38:06.273+0000] {scheduler_job_runner.py:450} INFO - 1 tasks up for execution:
<TaskInstance: example_xcom_args.print_value manual__2025-05-14T07:38:02.428050+00:00 [scheduled]>
[2025-05-14T07:38:06.274+0000] {scheduler_job_runner.py:522} INFO - DAG example_xcom_args has 0/16 running and queued tasks
[2025-05-14T07:38:06.275+0000] {scheduler_job_runner.py:661} INFO - Setting the following tasks to queued state:
<TaskInstance: example_xcom_args.print_value manual__2025-05-14T07:38:02.428050+00:00 [scheduled]>
[2025-05-14T07:38:06.280+0000] {scheduler_job_runner.py:767} INFO - Trying to enqueue tasks: [<TaskInstance: example_xcom_args.print_value manual__2025-05-14T07:38:02.428050+00:00 [scheduled]>] for executor: LocalExecutor(parallelism=32)
[2025-05-14T07:38:08.936+0000] {s3_task_handler.py:124} ERROR - Could not verify previous log to append
Traceback (most recent call last):
File "/opt/airflow/providers/amazon/src/airflow/providers/amazon/aws/log/s3_task_handler.py", line 120, in write
if append and self.s3_log_exists(remote_log_location):
File "/opt/airflow/providers/amazon/src/airflow/providers/amazon/aws/log/s3_task_handler.py", line 81, in s3_log_exists
return self.hook.check_for_key(remote_log_location)
File "/opt/airflow/providers/amazon/src/airflow/providers/amazon/aws/hooks/s3.py", line 153, in wrapper
return func(*bound_args.args, **bound_args.kwargs)
File "/opt/airflow/providers/amazon/src/airflow/providers/amazon/aws/hooks/s3.py", line 126, in wrapper
return func(*bound_args.args, **bound_args.kwargs)
File "/opt/airflow/providers/amazon/src/airflow/providers/amazon/aws/hooks/s3.py", line 953, in check_for_key
obj = self.head_object(key, bucket_name)
File "/opt/airflow/providers/amazon/src/airflow/providers/amazon/aws/hooks/s3.py", line 153, in wrapper
return func(*bound_args.args, **bound_args.kwargs)
File "/opt/airflow/providers/amazon/src/airflow/providers/amazon/aws/hooks/s3.py", line 126, in wrapper
return func(*bound_args.args, **bound_args.kwargs)
File "/opt/airflow/providers/amazon/src/airflow/providers/amazon/aws/hooks/s3.py", line 934, in head_object
return self.get_conn().head_object(Bucket=bucket_name, Key=key)
File "/usr/local/lib/python3.9/site-packages/botocore/client.py", line 569, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/usr/local/lib/python3.9/site-packages/botocore/client.py", line 1005, in _make_api_call
http, parsed_response = self._make_request(
To reproduce:
Enable remote logging in airflow.cfg and set the remote log connection id.
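For concreteness, a minimal airflow.cfg fragment that enables S3 remote logging (the bucket path is a placeholder; the option names are the standard `[logging]` keys):

```ini
[logging]
remote_logging = True
remote_log_conn_id = s3_conn
; placeholder bucket/prefix -- substitute your own
remote_base_log_folder = s3://my-bucket/airflow/logs
```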
Committer
- I acknowledge that I am a maintainer/committer of the Apache Airflow project.
vinitsharswat