Multiple minor doc fixes (#14917)
XD-DENG authored Mar 20, 2021
1 parent 4531168 commit ed872a6
Showing 4 changed files with 4 additions and 4 deletions.
2 changes: 1 addition & 1 deletion airflow/api_connexion/openapi/v1.yaml
@@ -1652,7 +1652,7 @@ components:
type: string
readOnly: true
nullable: true
- description: If the DAG is SubDAG then it is the top level DAG identifier. Otherwise, nulll.
+ description: If the DAG is SubDAG then it is the top level DAG identifier. Otherwise, null.
is_paused:
type: boolean
nullable: true
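
For context, the ``root_dag_id`` field documented in this hunk is returned by the stable REST API. A minimal sketch of reading it, assuming a local webserver with the basic-auth backend enabled; the URL, credentials, and DAG id are illustrative, not part of this commit:

    import requests

    # Fetch one DAG from the stable REST API (the endpoint described by the v1.yaml spec above).
    resp = requests.get(
        "http://localhost:8080/api/v1/dags/example_subdag_operator.section-1",  # illustrative DAG id
        auth=("admin", "admin"),  # assumes the basic-auth backend; adjust for your deployment
    )
    resp.raise_for_status()

    # For a SubDAG this holds the top-level DAG id; for a regular DAG it is null (None in Python).
    print(resp.json()["root_dag_id"])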
2 changes: 1 addition & 1 deletion docs/apache-airflow/backport-providers.rst
@@ -98,7 +98,7 @@ Backport providers only work when they are installed in the same namespace as th
This is majority of cases when you simply run pip install - it installs all packages in the same folder
(usually in ``/usr/local/lib/pythonX.Y/site-packages``). But when you install the ``apache-airflow`` and
``apache-airflow-backport-package-*`` using different methods (for example using ``pip install -e .`` or
- ``pip install --user`` they might be installed in different namespaces.
+ ``pip install --user``) they might be installed in different namespaces.

If that's the case, the provider packages will not be importable (the error in such case is
``ModuleNotFoundError: No module named 'airflow.providers'``).
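
The namespace problem described in this hunk is easy to check for. A small stdlib-only diagnostic sketch (not part of this commit) that prints where ``airflow`` and ``airflow.providers`` resolve from:

    import importlib.util

    for name in ("airflow", "airflow.providers"):
        try:
            spec = importlib.util.find_spec(name)
        except ModuleNotFoundError:
            spec = None
        if spec is None:
            # This is the symptom described above for backport providers.
            print(f"{name}: not importable")
        else:
            # Namespace packages have no origin, only search locations.
            print(f"{name}: {spec.origin or list(spec.submodule_search_locations)}")

If the two entries point at different site-packages folders, or the second is not importable at all, the packages were installed into different namespaces as the docs describe.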
2 changes: 1 addition & 1 deletion docs/apache-airflow/dag-serialization.rst
@@ -41,7 +41,7 @@ as :class:`~airflow.models.serialized_dag.SerializedDagModel` model.
The Webserver now instead of having to parse the DAG file again, reads the
serialized DAGs in JSON, de-serializes them and create the DagBag and uses it
to show in the UI. And the Scheduler does not need the actual DAG for making Scheduling decisions,
- instead of using the DAG files, we use Serialized DAGs that contain all the information needing to
+ instead of using the DAG files, we use Serialized DAGs that contain all the information needed to
schedule the DAGs from Airflow 2.0.0 (this was done as part of :ref:`Scheduler HA <scheduler:ha>`).

One of the key features that is implemented as the part of DAG Serialization is that
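
To make the serialize/de-serialize round trip described in this hunk concrete, a minimal sketch assuming Airflow 2.0+, where ``SerializedDAG`` exposes ``to_dict``/``from_dict`` (treat the exact import path and API as an assumption, not something this commit specifies):

    import json

    from airflow.models import DAG
    from airflow.serialization.serialized_objects import SerializedDAG
    from airflow.utils.dates import days_ago

    dag = DAG(dag_id="demo", start_date=days_ago(1), schedule_interval=None)

    payload = SerializedDAG.to_dict(dag)         # the JSON-able form stored in the metadata DB
    restored = SerializedDAG.from_dict(payload)  # what the webserver and scheduler read back

    print(json.dumps(payload)[:100])  # a peek at the stored JSON
    print(restored.dag_id)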
2 changes: 1 addition & 1 deletion docs/apache-airflow/macros-ref.rst
@@ -62,7 +62,7 @@ Variable Description
``{{ ti }}`` same as ``{{ task_instance }}``
``{{ params }}`` a reference to the user-defined params dictionary which can be overridden by
the dictionary passed through ``trigger_dag -c`` if you enabled
- ``dag_run_conf_overrides_params` in ``airflow.cfg``
+ ``dag_run_conf_overrides_params`` in ``airflow.cfg``
``{{ var.value.my_var }}`` global defined variables represented as a dictionary
``{{ var.json.my_var.path }}`` global defined variables represented as a dictionary
with deserialized JSON object, append the path to the
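
For context on the ``params`` row fixed above, a minimal DAG sketch (names are illustrative, not part of this commit) whose default param can be overridden through ``trigger_dag -c '{"my_param": "other"}'`` when ``dag_run_conf_overrides_params`` is enabled in ``airflow.cfg``:

    from airflow.models import DAG
    from airflow.operators.bash import BashOperator
    from airflow.utils.dates import days_ago

    with DAG(
        dag_id="params_demo",
        start_date=days_ago(1),
        schedule_interval=None,
        params={"my_param": "default-value"},  # overridable via the trigger conf
    ) as dag:
        BashOperator(
            task_id="show_param",
            bash_command="echo {{ params.my_param }}",  # rendered by Jinja at runtime
        )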
