Error in worker logs: KeyError: 'task_name' #49966

@nicolamarangoni

Description

Apache Airflow version

3.0.0

If "Other Airflow 2 version" selected, which one?

No response

What happened?

I'm testing a fresh 3.0.0 installation. After starting the worker with the command airflow celery worker, following the usual startup output, I see these lines in the worker service logs:

[tasks]
. execute_workload
--- Logging error ---
Traceback (most recent call last):
File "/usr/local/lib/python3.12/logging/__init__.py", line 464, in format
return self._format(record)
File "/usr/local/lib/python3.12/logging/__init__.py", line 460, in _format
return self._fmt % values
KeyError: 'task_name'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.12/logging/__init__.py", line 1160, in emit
msg = self.format(record)
....

The service otherwise looks healthy, but the installation has no DAGs at the moment, so I haven't tested any DAG executions yet.
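For context, the failure pattern in the traceback can be reproduced outside Airflow: a %-style log format string that references a field (here 'task_name') which the LogRecord does not carry fails during formatting. This is a minimal sketch of the mechanism, not Airflow's actual logging configuration; the format string below is an assumption for illustration.

```python
import logging

# A %-style format string referencing a custom field, analogous to the
# 'task_name' placeholder in the worker's log format.
formatter = logging.Formatter("%(asctime)s %(task_name)s %(message)s")

# A plain LogRecord, as emitted by non-task code paths; it has no
# 'task_name' attribute.
record = logging.LogRecord(
    name="airflow", level=logging.INFO, pathname="worker.py", lineno=1,
    msg="worker ready", args=(), exc_info=None,
)

# Formatting fails: the underlying KeyError('task_name') is raised by
# self._fmt % values and, on current Pythons, re-raised as a ValueError
# chaining it -- the same chain shown in the "Logging error" traceback.
try:
    formatter.format(record)
except (KeyError, ValueError) as exc:
    print(type(exc).__name__, exc)
```

When this happens inside a logging handler, the handler's error path prints the "--- Logging error ---" block seen above instead of crashing the process, which is why the worker keeps running.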

What you think should happen instead?

The KeyError: 'task_name' should not happen.

How to reproduce

Install a fresh 3.0.0 greenfield deployment and launch the worker with the command:
airflow celery worker

Operating System

Official Docker image.

Versions of Apache Airflow Providers

No response

Deployment

Other

Deployment details

Official Docker image running on AWS ECS.

Anything else?

No response

Are you willing to submit PR?

  • Yes I am willing to submit a PR!
