Apache Airflow version
Other Airflow 2 version (please specify below)
What happened
With Airflow 2.4.3 running on macOS, a DAG using DockerOperator fails in combination with a local standalone deployment:
*** Reading local file: /Users/schustmi/airflow/logs/dag_id=test/run_id=manual__2022-12-20T09:42:32.767926+00:00/task_id=docker_task/attempt=1.log
[2022-12-20, 09:42:35 UTC] {taskinstance.py:1165} INFO - Dependencies all met for <TaskInstance: test.docker_task manual__2022-12-20T09:42:32.767926+00:00 [queued]>
[2022-12-20, 09:42:35 UTC] {taskinstance.py:1165} INFO - Dependencies all met for <TaskInstance: test.docker_task manual__2022-12-20T09:42:32.767926+00:00 [queued]>
[2022-12-20, 09:42:35 UTC] {taskinstance.py:1362} INFO -
--------------------------------------------------------------------------------
[2022-12-20, 09:42:35 UTC] {taskinstance.py:1363} INFO - Starting attempt 1 of 1
[2022-12-20, 09:42:35 UTC] {taskinstance.py:1364} INFO -
--------------------------------------------------------------------------------
[2022-12-20, 09:42:35 UTC] {taskinstance.py:1383} INFO - Executing <Task(DockerOperator): docker_task> on 2022-12-20 09:42:32.767926+00:00
[2022-12-20, 09:42:35 UTC] {standard_task_runner.py:55} INFO - Started process 98208 to run task
[2022-12-20, 09:42:35 UTC] {standard_task_runner.py:82} INFO - Running: ['airflow', 'tasks', 'run', 'test', 'docker_task', 'manual__2022-12-20T09:42:32.767926+00:00', '--job-id', '17', '--raw', '--subdir', 'DAGS_FOLDER/dag.py', '--cfg-path', '/var/folders/45/1tkl8h1d3tvf2q72t8p6bw5r0000gn/T/tmpe6h4cv9f']
[2022-12-20, 09:42:35 UTC] {standard_task_runner.py:83} INFO - Job 17: Subtask o
[2022-12-20, 09:42:35 UTC] {task_command.py:376} INFO - Running <TaskInstance: test.docker_task manual__2022-12-20T09:42:32.767926+00:00 [running]> on host schustmi-mac-work.fritz.box
[2022-12-20, 09:42:35 UTC] {taskinstance.py:1590} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_OWNER=airflow
AIRFLOW_CTX_DAG_ID=test
AIRFLOW_CTX_TASK_ID=docker_task
AIRFLOW_CTX_EXECUTION_DATE=2022-12-20T09:42:32.767926+00:00
AIRFLOW_CTX_TRY_NUMBER=1
AIRFLOW_CTX_DAG_RUN_ID=manual__2022-12-20T09:42:32.767926+00:00
[2022-12-20, 09:42:35 UTC] {local_task_job.py:159} INFO - Task exited with return code Negsignal.SIGSEGV
[2022-12-20, 09:42:35 UTC] {taskinstance.py:2623} INFO - 0 downstream tasks scheduled from follow-on schedule check
What you think should happen instead
The task should run successfully; on other operating systems, the same combination of a standalone deployment and DockerOperator works.
How to reproduce
pip install apache-airflow==2.4.3 apache-airflow-providers-docker==3.3.0
airflow standalone
Then run the following DAG:
from airflow import DAG
from airflow.providers.docker.operators.docker import DockerOperator
from datetime import datetime
with DAG(dag_id="test", schedule="@once", start_date=datetime.utcnow()) as dag:
    docker_task = DockerOperator(image="alpine:latest", command="echo test", task_id="docker_task")
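As an additional sanity check, the Docker daemon can be exercised directly with the docker Python SDK (which the Docker provider uses under the hood), independent of Airflow's task runner. A minimal sketch, assuming the default DOCKER_HOST / Unix-socket configuration and the same alpine:latest image:

# Sanity check: talk to the Docker daemon directly, bypassing Airflow,
# using the docker SDK that apache-airflow-providers-docker depends on.
import docker

client = docker.from_env()  # honours DOCKER_HOST or the default unix socket
# Run the same container and command as the DAG; returns the container's stdout.
output = client.containers.run("alpine:latest", "echo test", remove=True)
print(output.decode().strip())  # expected output: "test"

If this snippet succeeds but the DockerOperator task still exits with SIGSEGV, the problem is likely in the spawned task process rather than in the Docker connection itself.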
Operating System
macOS Ventura 13.0 (22A380)
Versions of Apache Airflow Providers
apache-airflow-providers-common-sql 1.3.1
apache-airflow-providers-docker 3.3.0
apache-airflow-providers-ftp 3.2.0
apache-airflow-providers-http 4.1.0
apache-airflow-providers-imap 3.1.0
apache-airflow-providers-sqlite 3.3.1
Deployment
Virtualenv installation
Deployment details
No response
Anything else
No response
Are you willing to submit PR?
- Yes I am willing to submit a PR!
Code of Conduct
- I agree to follow this project's Code of Conduct