How to limit the maximum log size for a single task run in Airflow? #40880
Unanswered
HuanjieGuo asked this question in Q&A

I have a disk dedicated to storing the logs of task runs. Sometimes a task contains long-running logic that logs heavily, and it can produce a log file of roughly 5 GB. It is dangerous for a single task to produce such a large log file, because it can fill the disk. My question is: how can we limit the maximum log size for a single task run in an Airflow cluster?

Replies: 2 comments · 3 replies
Not that I know of. IMO, producing a 5 GB log file should be avoided in the first place. Since Airflow 2.6.0 you can delete local log files when using remote logging: local files are deleted after they are uploaded to the remote location, which frees up disk space again. https://airflow.apache.org/docs/apache-airflow/stable/configurations-ref.html#delete-local-logs
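For reference, a minimal sketch of what that setup might look like in airflow.cfg, assuming S3 as the remote log store; the bucket path and connection id below are placeholders, not values from this thread:

```ini
[logging]
# Ship task logs to remote storage (S3 is used here purely as an example).
remote_logging = True
remote_base_log_folder = s3://my-bucket/airflow/logs
remote_log_conn_id = my_s3_conn

# Available since Airflow 2.6.0: delete the local copy of a task log once it
# has been uploaded to the remote location, so large logs do not pile up on
# the worker's disk.
delete_local_logs = True
```

The same options can also be set through environment variables, e.g. AIRFLOW__LOGGING__DELETE_LOCAL_LOGS=True. Note that this only frees local disk space after upload; it does not cap how large a single task's log can grow.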
@potiuk does this feature make sense to the community? We can try to contribute it.
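For illustration only, here is a rough sketch of what such a limit could look like if someone wanted to prototype it: a plain Python logging filter that drops records once a byte budget is exhausted. Everything below (the class name, the default limit, and the idea of wiring it into Airflow's task log handler) is a hypothetical sketch, not an existing Airflow feature or API.

```python
import logging


class MaxBytesFilter(logging.Filter):
    """Drop log records after roughly `max_bytes` of message text has been emitted."""

    def __init__(self, max_bytes: int = 100 * 1024 * 1024) -> None:
        super().__init__()
        self.max_bytes = max_bytes
        self.emitted = 0
        self.truncated = False

    def filter(self, record: logging.LogRecord) -> bool:
        if self.truncated:
            # Budget already exhausted: silently drop everything that follows.
            return False
        self.emitted += len(record.getMessage().encode("utf-8", "replace"))
        if self.emitted > self.max_bytes:
            # Emit one final marker line, then suppress the rest.
            self.truncated = True
            record.msg = "*** log size limit reached, further output suppressed ***"
            record.args = ()
        return True


if __name__ == "__main__":
    # Stand-alone demo with a tiny 200-byte budget; in Airflow the filter would
    # have to be attached to the task log handler instead (hypothetical wiring).
    logging.basicConfig(level=logging.INFO)
    logger = logging.getLogger("demo")
    logger.addFilter(MaxBytesFilter(max_bytes=200))
    for i in range(20):
        logger.info("line %d with some padding to use up the byte budget", i)
```

A real contribution would presumably hook into Airflow's FileTaskHandler and be driven by a configuration option rather than a hard-coded filter, but that is beyond this sketch.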