3 changes: 1 addition & 2 deletions airflow-core/docs/best-practices.rst
@@ -296,8 +296,6 @@ When you execute that code you will see:

This means that the ``get_array`` is not executed as top-level code, but ``get_task_id`` is.
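
For context, the pattern under discussion can be sketched as follows (a
hypothetical DAG; the point is that ``get_task_id()`` is *called* while the
file is parsed, whereas ``get_array`` is only *referenced* as a callable):

.. code-block:: python

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator  # Airflow 2.x import path


    def get_task_id():
        # Called while building the DAG, i.e. executed as top-level code
        # on every parse of this file by the scheduler.
        return "print_array"


    def get_array():
        # Only referenced below, never called at parse time; it runs when
        # the task executes on a worker.
        return [1, 2, 3]


    with DAG(dag_id="example_top_level_code", start_date=datetime(2024, 1, 1), schedule=None):
        PythonOperator(task_id=get_task_id(), python_callable=get_array)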

-.. _best_practices/dynamic_dag_generation:

Code Quality and Linting
------------------------

@@ -351,6 +349,7 @@ By integrating ``ruff`` into your development workflow, you can proactively addr

For more information on ``ruff`` and its integration with Airflow, refer to the `official Airflow documentation <https://airflow.apache.org/docs/apache-airflow/stable/best-practices.html>`_.

+.. _best_practices/dynamic_dag_generation:

Dynamic DAG Generation
----------------------
12 changes: 7 additions & 5 deletions airflow-core/docs/howto/docker-compose/index.rst
@@ -307,11 +307,13 @@ Examples of how you can extend the image with custom providers, python packages,
apt packages and more can be found in :doc:`Building the image <docker-stack:build>`.

.. note::
-Creating custom images means that you need to maintain also a level of automation as you need to re-create the images
-when either the packages you want to install or Airflow is upgraded. Please do not forget about keeping these scripts.
-Also keep in mind, that in cases when you run pure Python tasks, you can use the
-`Python Virtualenv functions <_howto/operator:PythonVirtualenvOperator>`_ which will
-dynamically source and install python dependencies during runtime. With Airflow 2.8.0 Virtualenvs can also be cached.
+Creating custom images means that you need to maintain also a level of
+automation as you need to re-create the images when either the packages you
+want to install or Airflow is upgraded. Please do not forget about keeping
+these scripts. Also keep in mind, that in cases when you run pure Python
+tasks, you can use :ref:`Python Virtualenv functions <howto/operator:PythonVirtualenvOperator>`,
+which will dynamically source and install python dependencies during runtime.
+With Airflow 2.8.0, virtualenvs can also be cached.
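
A minimal sketch of the virtualenv route the note describes (hypothetical task
and requirement; ``venv_cache_path``, which enables the 2.8.0 caching, is an
optional parameter of ``@task.virtualenv``):

.. code-block:: python

    from airflow.decorators import task


    @task.virtualenv(requirements=["pandas==2.1.4"], venv_cache_path="/opt/airflow/venv-cache")
    def summarize():
        # Runs inside a virtualenv created (or reused from the cache) at
        # runtime, so pandas does not need to be baked into the image.
        import pandas as pd

        return int(pd.Series([1, 2, 3]).sum())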

Special case - adding dependencies via requirements.txt file
============================================================
3 changes: 2 additions & 1 deletion airflow-core/docs/howto/dynamic-dag-generation.rst
@@ -40,7 +40,8 @@ If you want to use variables to configure your code, you should always use
`environment variables <https://wiki.archlinux.org/title/environment_variables>`_ in your
top-level code rather than :doc:`Airflow Variables </core-concepts/variables>`. Using Airflow Variables
in top-level code creates a connection to the metadata DB of Airflow to fetch the value, which can slow
-down parsing and place extra load on the DB. See the `best practices on Airflow Variables <best_practice:airflow_variables>`_
+down parsing and place extra load on the DB. See
+:ref:`best practices on Airflow Variables <best_practices/airflow_variables>`
to make the best use of Airflow Variables in your dags using Jinja templates.
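
A minimal sketch of the recommended approach, using the hypothetical
``DEPLOYMENT`` variable from the example that follows:

.. code-block:: python

    import os

    # Read at parse time, but from the process environment: unlike
    # Variable.get() in top-level code, this never touches the metadata DB.
    deployment = os.environ.get("DEPLOYMENT", "PROD")

    schedule = None if deployment == "DEV" else "@daily"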

For example you could set ``DEPLOYMENT`` variable differently for your production and development
2 changes: 1 addition & 1 deletion airflow-core/docs/installation/upgrading_to_airflow3.rst
@@ -71,7 +71,7 @@ Some changes can be automatically fixed. To do so, run the following command:
ruff check dag/ --select AIR301 --fix --preview


-You can also configure these flags through configuration files. See `Configuring Ruff <Configuring Ruff>`_ for details.
+You can also configure these flags through configuration files. See `Configuring Ruff <https://docs.astral.sh/ruff/configuration/>`_ for details.
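
As a sketch, the flags from the command above could live in ``pyproject.toml``
(assuming ruff's standard configuration keys; enabling ``fix`` and ``preview``
globally is a project-level choice):

.. code-block:: toml

    [tool.ruff]
    fix = true      # apply fixes automatically, as --fix does
    preview = true  # the command above passes --preview for the AIR rules

    [tool.ruff.lint]
    select = ["AIR301"]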

Step 4: Install the Standard Providers
--------------------------------------
9 changes: 5 additions & 4 deletions airflow-core/docs/public-airflow-interface.rst
@@ -46,9 +46,9 @@ MAJOR version of Airflow. On the other hand, classes and methods starting with ``
as protected Python methods) and ``__`` (also known as private Python methods) are not part of the Public
Airflow Interface and might change at any time.

-You can also use Airflow's Public Interface via the `Stable REST API <stable-rest-api-ref>`_ (based on the
+You can also use Airflow's Public Interface via the :doc:`Stable REST API <stable-rest-api-ref>` (based on the
OpenAPI specification). For specific needs you can also use the
-`Airflow Command Line Interface (CLI) <cli-and-env-variables-ref>`_ though its behaviour might change
+:doc:`Airflow Command Line Interface (CLI) <cli-and-env-variables-ref>` though its behaviour might change
in details (such as output format and available flags) so if you want to rely on those in programmatic
way, the Stable REST API is recommended.
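
As a sketch of programmatic access through the stable API (assuming a local
deployment on port 8080 with the basic-auth API backend enabled; the
credentials are placeholders):

.. code-block:: python

    import requests

    # List DAGs through the versioned REST API rather than parsing CLI
    # output, whose format may change between releases.
    resp = requests.get(
        "http://localhost:8080/api/v1/dags",
        auth=("admin", "admin"),  # placeholder credentials
        timeout=10,
    )
    resp.raise_for_status()
    for dag in resp.json()["dags"]:
        print(dag["dag_id"])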

@@ -408,11 +408,12 @@ Everything not mentioned in this document should be considered as non-Public Int
Sometimes in other applications those components could be relied on to keep backwards compatibility,
but in Airflow they are not parts of the Public Interface and might change any time:

-* `Database structure <database-erd-ref>`_ is considered to be an internal implementation
+* :doc:`Database structure <database-erd-ref>` is considered to be an internal implementation
detail and you should not assume the structure is going to be maintained in a
backwards-compatible way.

-* `Web UI <ui>`_ is continuously evolving and there are no backwards compatibility guarantees on HTML elements.
+* :doc:`Web UI <ui>` is continuously evolving and there are no backwards
+  compatibility guarantees on HTML elements.

* Python classes except those explicitly mentioned in this document, are considered an
internal implementation detail and you should not assume they will be maintained
2 changes: 2 additions & 0 deletions chart/docs/index.rst
@@ -81,6 +81,8 @@ Features
* Kerberos secure configuration
* One-command deployment for any type of executor. You don't need to provide other services e.g. Redis/Database to test the Airflow.

+.. _helm_chart_install:

Installing the Chart
--------------------
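
For orientation, the one-command deployment mentioned above usually boils down
to the following (the release name ``airflow`` and the namespace are
placeholders; the repo URL is the one this chart documents):

.. code-block:: bash

    helm repo add apache-airflow https://airflow.apache.org
    helm repo update

    # Install (or upgrade) the chart into its own namespace
    helm upgrade --install airflow apache-airflow/airflow \
        --namespace airflow --create-namespace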

7 changes: 3 additions & 4 deletions chart/docs/installing-helm-chart-from-sources.rst
@@ -16,17 +16,16 @@
under the License.

Installing Helm Chart from sources
-----------------------------------
+==================================

Released packages
'''''''''''''''''

.. jinja:: official_download_page

This page describes downloading and verifying ``Apache Airflow Official Helm Chart`` version
-``{{ package_version}}`` using officially released source packages. You can also install the chart
-directly from the ``airflow.apache.org`` repo as described in
-`Installing the chart <index#installing-the-chart>`_.
+``{{ package_version }}`` using officially released source packages. You can also install the chart
+directly from the ``airflow.apache.org`` repo as described in :ref:`helm_chart_install`.
You can choose different version of the chart by selecting different version from the drop-down at
the top-left of the page.

3 changes: 2 additions & 1 deletion providers/amazon/docs/operators/athena/index.rst
@@ -38,7 +38,8 @@ Airflow offers two ways to query data using Amazon Athena.
**Amazon Athena SQL (DB API Connection):** Opt for this if you need to execute multiple queries in the same operator and it's essential to retrieve and process query results directly in Airflow, such as for sensing values or further data manipulation.

.. note::
-Both connection methods uses `Amazon Web Services Connection <../../connections/aws>`_ under the hood for authentication.
+Both connection methods uses :doc:`Amazon Web Services Connection <../../connections/aws>`
+under the hood for authentication.
You should decide which connection method to use based on your use case.
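
A sketch of the operator-based route (hypothetical query, database, and S3
bucket; ``AthenaOperator`` and the parameters shown come from the Amazon
provider package):

.. code-block:: python

    from airflow.providers.amazon.aws.operators.athena import AthenaOperator

    run_query = AthenaOperator(
        task_id="run_athena_query",
        query="SELECT * FROM my_table LIMIT 10;",  # hypothetical table
        database="my_database",  # hypothetical Athena database
        output_location="s3://my-bucket/athena-results/",  # hypothetical bucket
        aws_conn_id="aws_default",  # the AWS connection used under the hood
    )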

.. toctree::