Merged
30 changes: 30 additions & 0 deletions docs/apache-airflow-providers-openlineage/guides/user.rst
@@ -414,6 +414,36 @@ which requires the DAG-level lineage to be enabled in some OpenLineage backend s
Disabling DAG-level lineage while enabling task-level lineage might cause errors or inconsistencies.


Passing parent job information to Spark jobs
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

The OpenLineage integration can automatically inject Airflow's information (namespace, job name, run id)
into Spark application properties as parent job information
(``spark.openlineage.parentJobNamespace``, ``spark.openlineage.parentJobName``, ``spark.openlineage.parentRunId``)
for :ref:`supported Operators <supported_classes:openlineage>`.
This allows the Spark integration to automatically include a ``parentRunFacet`` in the application-level OpenLineage event,
creating a parent-child relationship between tasks from different integrations.
See `Scheduling from Airflow <https://openlineage.io/docs/integrations/spark/configuration/airflow>`_.

.. warning::

If any of the above properties are manually specified in the Spark job configuration, the integration will refrain from injecting parent job properties to ensure that manually provided values are preserved.

You can enable this automation by setting the ``spark_inject_parent_job_info`` option to ``true`` in the Airflow configuration.

.. code-block:: ini

[openlineage]
transport = {"type": "http", "url": "http://example.com:5000", "endpoint": "api/v1/lineage"}
spark_inject_parent_job_info = true

The ``AIRFLOW__OPENLINEAGE__SPARK_INJECT_PARENT_JOB_INFO`` environment variable is an equivalent.

.. code-block:: ini

AIRFLOW__OPENLINEAGE__SPARK_INJECT_PARENT_JOB_INFO=true


Troubleshooting
===============

13 changes: 12 additions & 1 deletion docs/exts/templates/openlineage.rst.jinja2
@@ -18,13 +18,24 @@
#}
Core operators
==============
- At the moment, two core operators supports OpenLineage. These operators function as a 'black box,'
+ At the moment, two core operators support OpenLineage. These operators function as a 'black box,'
capable of running any code, which might limit the extent of lineage extraction. To enhance the extraction
of lineage information, operators can utilize the hooks listed below that support OpenLineage.

- :class:`~airflow.providers.standard.operators.python.PythonOperator` (via :class:`airflow.providers.openlineage.extractors.python.PythonExtractor`)
- :class:`~airflow.providers.standard.operators.bash.BashOperator` (via :class:`airflow.providers.openlineage.extractors.bash.BashExtractor`)

Spark operators
===============
The OpenLineage integration can automatically inject information into Spark application properties when a job is submitted from Airflow.
The following is a list of supported operators along with the corresponding information that can be injected.

apache-airflow-providers-google
"""""""""""""""""""""""""""""""

- :class:`~airflow.providers.google.cloud.operators.dataproc.DataprocSubmitJobOperator`
- Parent Job Information


:class:`~airflow.providers.common.sql.operators.sql.SQLExecuteQueryOperator`
============================================================================
4 changes: 2 additions & 2 deletions generated/provider_dependencies.json
@@ -631,7 +631,7 @@
"google": {
"deps": [
"PyOpenSSL>=23.0.0",
- "apache-airflow-providers-common-compat>=1.2.1",
+ "apache-airflow-providers-common-compat>=1.3.0",
"apache-airflow-providers-common-sql>=1.20.0",
"apache-airflow>=2.9.0",
"asgiref>=3.5.2",
@@ -973,7 +973,7 @@
},
"openlineage": {
"deps": [
- "apache-airflow-providers-common-compat>=1.2.1",
+ "apache-airflow-providers-common-compat>=1.3.0",
"apache-airflow-providers-common-sql>=1.20.0",
"apache-airflow>=2.9.0",
"attrs>=22.2",
@@ -0,0 +1,68 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.

from __future__ import annotations

import logging
from typing import TYPE_CHECKING

log = logging.getLogger(__name__)

if TYPE_CHECKING:
from airflow.providers.openlineage.utils.spark import inject_parent_job_information_into_spark_properties
else:
try:
from airflow.providers.openlineage.utils.spark import (
inject_parent_job_information_into_spark_properties,
)
except ImportError:
try:
from airflow.providers.openlineage.plugins.macros import (
lineage_job_name,
lineage_job_namespace,
lineage_run_id,
)
except ImportError:

def inject_parent_job_information_into_spark_properties(properties: dict, context) -> dict:
log.warning(
"Could not import `airflow.providers.openlineage.plugins.macros`. "
"Skipping the injection of OpenLineage parent job information into Spark properties."
)
return properties

else:

def inject_parent_job_information_into_spark_properties(properties: dict, context) -> dict:
if any(str(key).startswith("spark.openlineage.parent") for key in properties):
log.info(
"Some OpenLineage properties with parent job information are already present "
"in Spark properties. Skipping the injection of OpenLineage "
"parent job information into Spark properties."
)
return properties

ti = context["ti"]
ol_parent_job_properties = {
"spark.openlineage.parentJobNamespace": lineage_job_namespace(),
"spark.openlineage.parentJobName": lineage_job_name(ti),
"spark.openlineage.parentRunId": lineage_run_id(ti),
}
return {**properties, **ol_parent_job_properties}


__all__ = ["inject_parent_job_information_into_spark_properties"]
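The fallback above reimplements the injection using the provider's public macros, and skips injection when the user already set any ``spark.openlineage.parent*`` key. A minimal standalone sketch of that merge-or-skip logic, with hypothetical lineage values standing in for the real macro calls:

```python
def inject_parent_job_info(properties: dict, parent_info: dict) -> dict:
    # If any parent property was set manually, preserve the user's values verbatim.
    if any(str(key).startswith("spark.openlineage.parent") for key in properties):
        return properties
    # Otherwise merge the parent job information into the Spark properties.
    return {**properties, **parent_info}


# Hypothetical values; the real shim derives these from lineage_job_namespace(),
# lineage_job_name(ti) and lineage_run_id(ti).
parent = {
    "spark.openlineage.parentJobNamespace": "default",
    "spark.openlineage.parentJobName": "my_dag.my_task",
    "spark.openlineage.parentRunId": "11111111-2222-3333-4444-555555555555",
}

merged = inject_parent_job_info({"spark.executor.memory": "4g"}, parent)
untouched = inject_parent_job_info({"spark.openlineage.parentRunId": "custom"}, parent)
```

Note that the merge is a plain dict spread, so it never overwrites existing user keys in practice: the early-return guard already handles every colliding prefix.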
127 changes: 127 additions & 0 deletions providers/src/airflow/providers/google/cloud/openlineage/utils.py
@@ -17,6 +17,7 @@
# under the License.
from __future__ import annotations

import logging
import os
import pathlib
from typing import TYPE_CHECKING, Any
@@ -27,6 +28,7 @@
from google.cloud.bigquery.table import Table

from airflow.providers.common.compat.openlineage.facet import Dataset
from airflow.utils.context import Context

from airflow.providers.common.compat.openlineage.facet import (
BaseFacet,
@@ -40,9 +42,14 @@
SchemaDatasetFacetFields,
SymlinksDatasetFacet,
)
from airflow.providers.common.compat.openlineage.utils.spark import (
inject_parent_job_information_into_spark_properties,
)
from airflow.providers.google import __version__ as provider_version
from airflow.providers.google.cloud.hooks.gcs import _parse_gcs_url

log = logging.getLogger(__name__)

BIGQUERY_NAMESPACE = "bigquery"
BIGQUERY_URI = "bigquery"
WILDCARD = "*"
@@ -259,3 +266,123 @@ def get_from_nullable_chain(source: Any, chain: list[str]) -> Any | None:
return source
except AttributeError:
return None


def _is_openlineage_provider_accessible() -> bool:
"""
Check if the OpenLineage provider is accessible.

This function attempts to import the necessary OpenLineage modules and checks if the provider
is enabled and the listener is available.

Returns:
bool: True if the OpenLineage provider is accessible, False otherwise.
"""
try:
from airflow.providers.openlineage.conf import is_disabled
from airflow.providers.openlineage.plugins.listener import get_openlineage_listener
except ImportError:
log.debug("OpenLineage provider could not be imported.")
return False

if is_disabled():
log.debug("OpenLineage provider is disabled.")
return False

if not get_openlineage_listener():
log.debug("OpenLineage listener could not be found.")
return False

return True


def _extract_supported_job_type_from_dataproc_job(job: dict) -> str | None:
"""
Extract job type from a Dataproc job definition.

Args:
job: The Dataproc job definition.

Returns:
The job type for which the automatic OL injection is supported, if found, otherwise None.
"""
supported_job_types = ("sparkJob", "pysparkJob", "spark_job", "pyspark_job")
return next((job_type for job_type in supported_job_types if job_type in job), None)
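The helper leans on ``next`` with a ``None`` default to return the first supported job-type key found in the definition, e.g.:

```python
from __future__ import annotations

SUPPORTED_JOB_TYPES = ("sparkJob", "pysparkJob", "spark_job", "pyspark_job")


def extract_supported_job_type(job: dict) -> str | None:
    # First supported job-type key present in the definition, else None.
    return next((t for t in SUPPORTED_JOB_TYPES if t in job), None)


extract_supported_job_type({"placement": {}, "pysparkJob": {}})  # -> "pysparkJob"
extract_supported_job_type({"hiveJob": {}})  # -> None (Hive jobs are not supported)
```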


def _replace_dataproc_job_properties(job: dict, job_type: str, new_properties: dict) -> dict:
"""
Replace the properties of a specific job type in a Dataproc job definition.

Args:
job: The original Dataproc job definition.
job_type: The key representing the job type (e.g., "sparkJob").
new_properties: The new properties to replace the existing ones.

Returns:
A modified copy of the job with updated properties.

Raises:
KeyError: If the job_type does not exist in the job or lacks a "properties" field.
"""
if job_type not in job:
raise KeyError(f"Job type '{job_type}' is missing in the job definition.")

updated_job = job.copy()
updated_job[job_type] = job[job_type].copy()
updated_job[job_type]["properties"] = new_properties

return updated_job
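``_replace_dataproc_job_properties`` copies both the top-level job dict and the nested job-type dict before swapping in the new properties, so the caller's original definition stays untouched. A sketch of that copy-then-replace pattern:

```python
def replace_job_properties(job: dict, job_type: str, new_properties: dict) -> dict:
    if job_type not in job:
        raise KeyError(f"Job type '{job_type}' is missing in the job definition.")
    updated = job.copy()                      # shallow copy of the top level
    updated[job_type] = job[job_type].copy()  # copy the nested dict before mutating it
    updated[job_type]["properties"] = new_properties
    return updated


original = {"sparkJob": {"properties": {"spark.executor.memory": "2g"}}}
updated = replace_job_properties(
    original, "sparkJob", {"spark.openlineage.parentJobName": "my_dag.my_task"}
)
```

Without the second ``.copy()``, assigning ``"properties"`` would mutate the nested dict shared with the caller's original job definition.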


def inject_openlineage_properties_into_dataproc_job(
job: dict, context: Context, inject_parent_job_info: bool
) -> dict:
"""
Inject OpenLineage properties into Spark job definition.

The function does not remove any configuration or modify the job in any other way;
it only adds the desired OpenLineage properties to the Dataproc job definition if they are not already present.

Note:
Any modification to job will be skipped if:
- OpenLineage provider is not accessible.
- The job type is not supported.
- Automatic parent job information injection is disabled.
- Any OpenLineage properties with parent job information are already present
in the Spark job definition.

Args:
job: The original Dataproc job definition.
context: The Airflow context in which the job is running.
inject_parent_job_info: Flag indicating whether to inject parent job information.

Returns:
The modified job definition with OpenLineage properties injected, if applicable.
"""
if not inject_parent_job_info:
log.debug("Automatic injection of OpenLineage information is disabled.")
return job

if not _is_openlineage_provider_accessible():
log.warning(
"Could not access OpenLineage provider for automatic OpenLineage "
"properties injection. No action will be performed."
)
return job

if (job_type := _extract_supported_job_type_from_dataproc_job(job)) is None:
log.warning(
"Could not find a supported Dataproc job type for automatic OpenLineage "
"properties injection. No action will be performed.",
)
return job

properties = job[job_type].get("properties", {})

properties = inject_parent_job_information_into_spark_properties(properties=properties, context=context)

job_with_ol_config = _replace_dataproc_job_properties(
job=job, job_type=job_type, new_properties=properties
)
return job_with_ol_config
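Taken together, the function is a short guarded pipeline: bail out early, find the supported job type, merge the properties, and return an updated copy. A self-contained sketch of that flow (the provider-accessibility check is stubbed out as a boolean, and the parent properties are hypothetical):

```python
SUPPORTED_JOB_TYPES = ("sparkJob", "pysparkJob", "spark_job", "pyspark_job")


def inject_openlineage_properties(
    job: dict, parent_info: dict, inject_parent_job_info: bool, provider_accessible: bool = True
) -> dict:
    # Guard clauses: leave the job untouched when injection cannot or should not run.
    if not inject_parent_job_info or not provider_accessible:
        return job
    job_type = next((t for t in SUPPORTED_JOB_TYPES if t in job), None)
    if job_type is None:
        return job
    properties = job[job_type].get("properties", {})
    # Respect manually supplied parent properties.
    if not any(str(k).startswith("spark.openlineage.parent") for k in properties):
        properties = {**properties, **parent_info}
    # Return an updated copy; the original job definition is not mutated.
    return {**job, job_type: {**job[job_type], "properties": properties}}


parent = {
    "spark.openlineage.parentJobNamespace": "default",
    "spark.openlineage.parentJobName": "my_dag.my_task",
    "spark.openlineage.parentRunId": "11111111-2222-3333-4444-555555555555",
}
job = {
    "pysparkJob": {
        "main_python_file_uri": "gs://my-bucket/app.py",
        "properties": {"spark.executor.memory": "2g"},
    }
}
updated = inject_openlineage_properties(job, parent, inject_parent_job_info=True)
```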
12 changes: 12 additions & 0 deletions providers/src/airflow/providers/google/cloud/operators/dataproc.py
@@ -54,6 +54,9 @@
DataprocWorkflowLink,
DataprocWorkflowTemplateLink,
)
from airflow.providers.google.cloud.openlineage.utils import (
inject_openlineage_properties_into_dataproc_job,
)
from airflow.providers.google.cloud.operators.cloud_base import GoogleCloudBaseOperator
from airflow.providers.google.cloud.triggers.dataproc import (
DataprocBatchTrigger,
@@ -1962,6 +1965,9 @@ def __init__(
polling_interval_seconds: int = 10,
cancel_on_kill: bool = True,
wait_timeout: int | None = None,
openlineage_inject_parent_job_info: bool = conf.getboolean(
"openlineage", "spark_inject_parent_job_info", fallback=False
),
**kwargs,
) -> None:
super().__init__(**kwargs)
@@ -1983,10 +1989,16 @@ def __init__(
self.hook: DataprocHook | None = None
self.job_id: str | None = None
self.wait_timeout = wait_timeout
self.openlineage_inject_parent_job_info = openlineage_inject_parent_job_info

def execute(self, context: Context):
self.log.info("Submitting job")
self.hook = DataprocHook(gcp_conn_id=self.gcp_conn_id, impersonation_chain=self.impersonation_chain)
if self.openlineage_inject_parent_job_info:
self.log.info("Automatic injection of OpenLineage information into Spark properties is enabled.")
self.job = inject_openlineage_properties_into_dataproc_job(
job=self.job, context=context, inject_parent_job_info=self.openlineage_inject_parent_job_info
)
job_object = self.hook.submit_job(
project_id=self.project_id,
region=self.region,
2 changes: 1 addition & 1 deletion providers/src/airflow/providers/google/provider.yaml
@@ -101,7 +101,7 @@ versions:

dependencies:
- apache-airflow>=2.9.0
- - apache-airflow-providers-common-compat>=1.2.1
+ - apache-airflow-providers-common-compat>=1.3.0
- apache-airflow-providers-common-sql>=1.20.0
- asgiref>=3.5.2
- dill>=0.2.3
6 changes: 6 additions & 0 deletions providers/src/airflow/providers/openlineage/conf.py
@@ -77,6 +77,12 @@ def selective_enable() -> bool:
return conf.getboolean(_CONFIG_SECTION, "selective_enable", fallback="False")


@cache
def spark_inject_parent_job_info() -> bool:
"""[openlineage] spark_inject_parent_job_info."""
return conf.getboolean(_CONFIG_SECTION, "spark_inject_parent_job_info", fallback="False")


@cache
def custom_extractors() -> set[str]:
"""[openlineage] extractors."""
10 changes: 9 additions & 1 deletion providers/src/airflow/providers/openlineage/provider.yaml
@@ -53,7 +53,7 @@ versions:
dependencies:
- apache-airflow>=2.9.0
- apache-airflow-providers-common-sql>=1.20.0
- - apache-airflow-providers-common-compat>=1.2.1
+ - apache-airflow-providers-common-compat>=1.3.0
- attrs>=22.2
- openlineage-integration-common>=1.24.2
- openlineage-python>=1.24.2
@@ -184,3 +184,11 @@ config:
example: ~
type: boolean
version_added: 1.11.0
spark_inject_parent_job_info:
description: |
Automatically inject OpenLineage's parent job (namespace, job name, run id) information into Spark
application properties for supported Operators.
type: boolean
default: "False"
example: ~
version_added: 1.15.0