Merged
4 changes: 2 additions & 2 deletions airflow-core/docs/installation/upgrading.rst
@@ -52,7 +52,7 @@ In some cases the upgrade happens automatically - it depends if in your deployme
built-in as post-install action. For example when you are using :doc:`helm-chart:index` with
post-upgrade hooks enabled, the database upgrade happens automatically right after the new software
is installed. Similarly all Airflow-As-A-Service solutions perform the upgrade automatically for you,
when you choose to upgrade airflow via their UI.
when you choose to upgrade Airflow via their UI.

How to upgrade
==============
@@ -74,7 +74,7 @@ you access to Airflow ``CLI`` :doc:`/howto/usage-cli` and the database.
Offline SQL migration scripts
=============================
If you want to run the upgrade script offline, you can use the ``-s`` or ``--show-sql-only`` flag
to get the SQL statements that would be executed. You may also specify the starting airflow version with the ``--from-version`` flag and the ending airflow version with the ``-n`` or ``--to-version`` flag. This feature is supported in Postgres and MySQL
to get the SQL statements that would be executed. You may also specify the starting Airflow version with the ``--from-version`` flag and the ending Airflow version with the ``-n`` or ``--to-version`` flag. This feature is supported in Postgres and MySQL
from Airflow 2.0.0 onward.

Sample usage for Airflow version 2.7.0 or greater:
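(The sample block itself is collapsed in this diff. As a hedged sketch of the flags described above — the ``airflow db migrate`` command name and the version numbers are assumptions to adapt to your setup:)

.. code-block:: bash

   # Print the SQL the upgrade would run, without touching the database.
   # --from-version / --to-version bound the range of migrations to include.
   airflow db migrate --show-sql-only --from-version "2.7.0" --to-version "3.0.0" > upgrade.sql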
8 changes: 4 additions & 4 deletions airflow-core/docs/security/kerberos.rst
@@ -45,10 +45,10 @@ To enable Kerberos you will need to generate a (service) key tab.
# in the kadmin.local or kadmin shell, create the airflow principal
kadmin: addprinc -randkey airflow/fully.qualified.domain.name@YOUR-REALM.COM

# Create the airflow keytab file that will contain the airflow principal
# Create the Airflow keytab file that will contain the Airflow principal
kadmin: xst -norandkey -k airflow.keytab airflow/fully.qualified.domain.name

Now store this file in a location where the airflow user can read it (chmod 600). And then add the following to
Now store this file in a location where the Airflow user can read it (chmod 600). And then add the following to
your ``airflow.cfg``

.. code-block:: ini
@@ -103,9 +103,9 @@ Launch the ticket renewer by
To support more advanced deployment models for using kerberos in standard or one-time fashion,
you can specify the mode while running the ``airflow kerberos`` by using the ``--one-time`` flag.

a) standard: The airflow kerberos command will run endlessly. The ticket renewer process runs continuously every few seconds
a) standard: The Airflow kerberos command will run endlessly. The ticket renewer process runs continuously every few seconds
and refreshes the ticket if it has expired.
b) one-time: The airflow kerberos will run once and exit. In case of failure the main task won't spin up.
b) one-time: The Airflow kerberos will run once and exit. In case of failure the main task won't spin up.

The default mode is standard.

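(A hedged illustration of the two modes described above — the principal is the one from the earlier example and the keytab path comes from ``airflow.cfg``; adjust both for your environment:)

.. code-block:: bash

   # standard mode (default): runs endlessly and refreshes the ticket before it expires
   airflow kerberos airflow/fully.qualified.domain.name@YOUR-REALM.COM

   # one-time mode: obtains a ticket once and exits; a failure here stops the main task from starting
   airflow kerberos --one-time airflow/fully.qualified.domain.name@YOUR-REALM.COM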
4 changes: 2 additions & 2 deletions airflow-core/docs/security/sbom.rst
@@ -25,7 +25,7 @@ of the software dependencies.
The general use case for such files is to help assess and manage risks. For instance a quick lookup against your SBOM files can help identify if a CVE (Common Vulnerabilities and Exposures) in a
library is affecting you.

By default, Apache Airflow SBOM files are generated for airflow core with all providers. In the near future we aim at generating SBOM files per provider and also provide them for docker standard images.
By default, Apache Airflow SBOM files are generated for Airflow core with all providers. In the near future we aim at generating SBOM files per provider and also provide them for docker standard images.

Each airflow version has its own SBOM files, one for each supported python version.
Each Airflow version has its own SBOM files, one for each supported python version.
You can find them `here <https://airflow.apache.org/docs/apache-airflow/stable/sbom>`_.
2 changes: 1 addition & 1 deletion airflow-core/docs/security/security_model.rst
@@ -240,7 +240,7 @@ in the Scheduler and API Server processes.
Deploying and protecting Airflow installation
.............................................

Deployment Managers are also responsible for deploying airflow and make it accessible to the users
Deployment Managers are also responsible for deploying Airflow and make it accessible to the users
in the way that follows best practices of secure deployment applicable to the organization where
Airflow is deployed. This includes but is not limited to:

Expand Down
4 changes: 2 additions & 2 deletions airflow-core/docs/security/workload.rst
@@ -29,8 +29,8 @@ instances based on the task's ``run_as_user`` parameter, which takes a user's na
**NOTE:** For impersonations to work, Airflow requires ``sudo`` as subtasks are run
with ``sudo -u`` and permissions of files are changed. Furthermore, the unix user
needs to exist on the worker. Here is what a simple sudoers file entry could look
like to achieve this, assuming airflow is running as the ``airflow`` user. This means
the airflow user must be trusted and treated the same way as the root user.
like to achieve this, assuming Airflow is running as the ``airflow`` user. This means
the Airflow user must be trusted and treated the same way as the root user.

.. code-block:: none

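(The sudoers entry itself is collapsed above. A hedged sketch of what granting those rights could look like — the exact entry is an assumption, so review it against your own security policy before using it:)

.. code-block:: bash

   # Allow the airflow user to impersonate other unix users via sudo -u (effectively root-equivalent trust)
   echo 'airflow ALL=(ALL) NOPASSWD: ALL' | sudo tee /etc/sudoers.d/airflow
   sudo chmod 0440 /etc/sudoers.d/airflow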
4 changes: 2 additions & 2 deletions airflow-core/docs/tutorial/pipeline.rst
@@ -163,7 +163,7 @@ Next, we'll download a CSV file, save it locally, and load it into ``employees_t

@task
def get_data():
# NOTE: configure this as appropriate for your airflow environment
# NOTE: configure this as appropriate for your Airflow environment
data_path = "/opt/airflow/dags/files/employees.csv"
os.makedirs(os.path.dirname(data_path), exist_ok=True)

@@ -280,7 +280,7 @@ Now that we've defined all our tasks, it's time to put them together into a DAG.

@task
def get_data():
# NOTE: configure this as appropriate for your airflow environment
# NOTE: configure this as appropriate for your Airflow environment
data_path = "/opt/airflow/dags/files/employees.csv"
os.makedirs(os.path.dirname(data_path), exist_ok=True)

2 changes: 1 addition & 1 deletion airflow-core/newsfragments/40029.significant.rst
@@ -1,4 +1,4 @@
Removed deprecated airflow configuration ``webserver.allow_raw_html_descriptions`` from UI Trigger forms.
Removed deprecated Airflow configuration ``webserver.allow_raw_html_descriptions`` from UI Trigger forms.

* Types of change

2 changes: 1 addition & 1 deletion airflow-core/newsfragments/41564.significant.rst
@@ -1,4 +1,4 @@
Move all time operators and sensors from airflow core to standard provider
Move all time operators and sensors from Airflow core to standard provider

* Types of change

2 changes: 1 addition & 1 deletion airflow-core/newsfragments/42252.significant.rst
@@ -1,4 +1,4 @@
Move bash operators from airflow core to standard provider
Move bash operators from Airflow core to standard provider

* Types of change

2 changes: 1 addition & 1 deletion airflow-core/newsfragments/47441.significant.rst
@@ -1,7 +1,7 @@
There are no more production bundle or devel extras

There are no more production ``all*`` or ``devel*`` bundle extras available in ``wheel`` package of airflow.
If you want to install airflow with all extras you can use ``uv pip install --all-extras`` command.
If you want to install Airflow with all extras you can use ``uv pip install --all-extras`` command.

* Types of change

4 changes: 2 additions & 2 deletions airflow-core/newsfragments/48223.significant.rst
@@ -9,10 +9,10 @@ already existing ``providers`` and the dependencies are isolated and simplified
packages.

While the original installation methods via ``apache-airflow`` distribution package and extras still
work as previously and it installs complete airflow installation ready to serve as scheduler, webserver, triggerer
work as previously and it installs complete Airflow installation ready to serve as scheduler, webserver, triggerer
and worker, the ``apache-airflow`` package is now a meta-package that can install all the other distribution
packages (mandatory or via optional extras), it's also possible to install only the distribution
packages that are needed for a specific component you want to run airflow with.
packages that are needed for a specific component you want to run Airflow with.

One change vs. Airflow 2 is that neither ``apache-airflow`` nor ``apache-airflow-core`` distribution packages
have ``leveldb`` extra that is an optional feature of ``apache-airflow-providers-google`` distribution package.
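(A hedged illustration of the packaging layout this newsfragment describes — the package names follow the text above, and pinned versions are deliberately omitted:)

.. code-block:: bash

   pip install apache-airflow                                # meta-package: full installation, as before
   pip install apache-airflow-core                           # only the core distribution
   pip install "apache-airflow-providers-google[leveldb]"    # the leveldb extra now lives on the google provider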
2 changes: 1 addition & 1 deletion airflow-core/newsfragments/49161.significant.rst
@@ -1,4 +1,4 @@
Removed airflow configuration ``navbar_logo_text_color``
Removed Airflow configuration ``navbar_logo_text_color``

* Types of change

2 changes: 1 addition & 1 deletion airflow-core/newsfragments/template.significant.rst
@@ -4,7 +4,7 @@

.. Check the type of change that applies to this change
.. Dag changes: requires users to change their dag code
.. Config changes: requires users to change their airflow config
.. Config changes: requires users to change their Airflow config
.. API changes: requires users to change their Airflow REST API calls
.. CLI changes: requires users to change their Airflow CLI usage
.. Behaviour changes: the existing code won't break, but the behavior is different
4 changes: 2 additions & 2 deletions chart/docs/keda.rst
@@ -39,7 +39,7 @@ of tasks in ``queued`` or ``running`` state.
--namespace keda \
--version "v2.0.0"

Enable for the airflow instance by setting ``workers.keda.enabled=true`` in your
Enable for the Airflow instance by setting ``workers.keda.enabled=true`` in your
helm command or in the ``values.yaml``.

.. code-block:: bash
@@ -51,7 +51,7 @@ helm command or in the ``values.yaml``.
--set executor=CeleryExecutor \
--set workers.keda.enabled=true

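(The full command is collapsed above. A hedged sketch of enabling KEDA for an existing release — the release name and namespace are assumptions:)

.. code-block:: bash

   helm repo add apache-airflow https://airflow.apache.org
   helm upgrade --install airflow apache-airflow/airflow \
       --namespace airflow --create-namespace \
       --set executor=CeleryExecutor \
       --set workers.keda.enabled=true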
A ``ScaledObject`` and an ``hpa`` will be created in the airflow namespace.
A ``ScaledObject`` and an ``hpa`` will be created in the Airflow namespace.

KEDA will derive the desired number of Celery workers by querying
Airflow metadata database:
2 changes: 1 addition & 1 deletion chart/docs/manage-dag-files.rst
@@ -24,7 +24,7 @@ When you create new or modify existing DAG files, it is necessary to deploy them
Bake dags in docker image
-------------------------

With this approach, you include your dag files and related code in the airflow image.
With this approach, you include your dag files and related code in the Airflow image.

This method requires redeploying the services in the helm chart with the new docker image in order to deploy the new DAG code. This can work well particularly if DAG code is not expected to change frequently.

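(A hedged sketch of this approach — the base image tag, registry and paths are assumptions:)

.. code-block:: bash

   # Build an image that contains the DAG files, then point the chart at it
   cat > Dockerfile <<'EOF'
   FROM apache/airflow:3.0.0
   COPY dags/ /opt/airflow/dags/
   EOF
   docker build -t my-registry/airflow-with-dags:1.0.0 .
   docker push my-registry/airflow-with-dags:1.0.0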
2 changes: 1 addition & 1 deletion chart/docs/production-guide.rst
@@ -219,7 +219,7 @@ Example to create a Kubernetes Secret from ``kubectl``:

The webserver key is also used to authorize requests to Celery workers when logs are retrieved. The token
generated using the secret key has a short expiry time though - make sure that time on ALL the machines
that you run airflow components on is synchronized (for example using ntpd) otherwise you might get
that you run Airflow components on is synchronized (for example using ntpd) otherwise you might get
"forbidden" errors when the logs are accessed.

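(A hedged sketch of the ``kubectl`` example referenced above — the secret name and key are assumptions, and the chart value pointing at the secret would need to match:)

.. code-block:: bash

   # Generate a random webserver secret key and store it in a Kubernetes Secret
   kubectl create secret generic my-webserver-secret \
       --from-literal="webserver-secret-key=$(python3 -c 'import secrets; print(secrets.token_hex(16))')"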
Eviction configuration
2 changes: 1 addition & 1 deletion contributing-docs/01_roles_in_airflow_project.rst
@@ -113,7 +113,7 @@ There are certain expectations from the members of the security team:

* The security team members might inform 3rd parties about fixes, for example in order to assess if the fix
is solving the problem or in order to assess its applicability to be applied by 3rd parties, as soon
as a PR solving the issue is opened in the public airflow repository.
as a PR solving the issue is opened in the public Airflow repository.

* In case of critical security issues, the members of the security team might iterate on a fix in a
private repository and only open the PR in the public repository once the fix is ready to be released,
10 changes: 5 additions & 5 deletions contributing-docs/03_contributors_quick_start.rst
@@ -184,7 +184,7 @@ Setting up virtual-env

sudo apt install openssl sqlite3 default-libmysqlclient-dev libmysqlclient-dev postgresql

If you want to install all airflow providers, more system dependencies might be needed. For example on Debian/Ubuntu
If you want to install all Airflow providers, more system dependencies might be needed. For example on Debian/Ubuntu
like system, this command will install all necessary dependencies that should be installed when you use
``all`` extras while installing airflow.

@@ -212,7 +212,7 @@ Forking and cloning Project
alt="Forking Apache Airflow project">
</div>

2. Goto your github account's fork of airflow click on ``Code`` you will find the link to your repo
2. Goto your github account's fork of Airflow click on ``Code`` you will find the link to your repo

.. raw:: html

@@ -450,7 +450,7 @@ see in CI in your local environment.
means that you are inside the Breeze container and ready to run most of the development tasks. You can leave
the environment with ``exit`` and re-enter it with just ``breeze`` command

6. Once you enter the Breeze environment, create airflow tables and users from the breeze CLI. ``airflow db reset``
6. Once you enter the Breeze environment, create Airflow tables and users from the breeze CLI. ``airflow db reset``
is required to execute at least once for Airflow Breeze to get the database/tables created. If you run
tests, however - the test database will be initialized automatically for you
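(A hedged sketch of this step, run from your Airflow sources checkout:)

.. code-block:: bash

   breeze              # enter the Breeze shell
   airflow db reset    # create the metadata database and tables; confirm when prompted
   exit                # leave the container when you are done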

@@ -580,7 +580,7 @@ Using Breeze
:select-layout tiled


2. Now you can access airflow web interface on your local machine at |http://localhost:28080| with user name ``admin``
2. Now you can access Airflow web interface on your local machine at |http://localhost:28080| with user name ``admin``
and password ``admin``

.. |http://localhost:28080| raw:: html
@@ -640,7 +640,7 @@ Following are some of important topics of `Breeze documentation <../dev/breeze/d
* `Troubleshooting Breeze environment <../dev/breeze/doc/04_troubleshooting.rst>`__


Installing airflow in the local venv
Installing Airflow in the local venv
------------------------------------

1. It may require some packages to be installed; watch the output of the command to see which ones are missing
2 changes: 1 addition & 1 deletion contributing-docs/05_pull_requests.rst
@@ -32,7 +32,7 @@ these guidelines:

- Include tests, either as doctests, unit tests, or both, to your pull request.

The airflow repo uses `GitHub Actions <https://help.github.com/en/actions>`__ to
The Airflow repo uses `GitHub Actions <https://help.github.com/en/actions>`__ to
run the tests and `codecov <https://codecov.io/gh/apache/airflow>`__ to track
coverage. You can set up both for free on your fork. It will help you make sure you do not
break the build with your PR and that you help increase coverage.
16 changes: 8 additions & 8 deletions contributing-docs/07_local_virtualenv.rst
@@ -55,7 +55,7 @@ Please refer to the `Dockerfile.ci <../Dockerfile.ci>`__ for a comprehensive lis
.. note::

As of version 2.8 Airflow follows PEP 517/518 and uses ``pyproject.toml`` file to define build dependencies
and build process and it requires relatively modern versions of packaging tools to get airflow built from
and build process and it requires relatively modern versions of packaging tools to get Airflow built from
local sources or ``sdist`` packages, as PEP 517 compliant build hooks are used to determine dynamic build
dependencies. In case of ``pip`` it means that at least version 22.1.0 is needed (released at the beginning of
2022) to build or install Airflow from sources. This does not affect the ability of installing Airflow from
@@ -139,7 +139,7 @@ need to change the python version.
Syncing project (including providers) with uv
.............................................

In a project like airflow it's important to have a consistent set of dependencies across all developers.
In a project like Airflow it's important to have a consistent set of dependencies across all developers.
You can use ``uv sync`` to install dependencies from ``pyproject.toml`` file. This will install all
dependencies from the ``pyproject.toml`` file in the current directory - including devel dependencies of
airflow, all providers dependencies.
@@ -148,7 +148,7 @@

uv sync

This will synchronize core dependencies of airflow including all optional core dependencies as well as
This will synchronize core dependencies of Airflow including all optional core dependencies as well as
installs sources for all preinstalled providers and their dependencies.

For example this is how you install dependencies for amazon provider, amazon provider sources,
@@ -165,7 +165,7 @@

uv sync --all-packages

This will synchronize all development extras of airflow and all packages (this might require some additional
This will synchronize all development extras of Airflow and all packages (this might require some additional
system dependencies to be installed - depending on your OS requirements).

Working on airflow-core only
@@ -215,7 +215,7 @@ command - or alternatively running ``uv run`` in the provider directory.:
Note that the ``uv sync`` command will automatically synchronize all dependencies needed for your provider
and it's development dependencies.

Creating and installing airflow with other build-frontends
Creating and installing Airflow with other build-frontends
----------------------------------------------------------

While ``uv`` uses ``workspace`` feature to synchronize both Airflow and Providers in a single sync
Expand All @@ -234,7 +234,7 @@ In Airflow 3.0 we moved each provider to a separate sub-folder in "providers" di
providers is a separate distribution with its own ``pyproject.toml`` file. The ``uv workspace`` feature allows
to install all the distributions together and work together on all or only selected providers.

When you install airflow from sources using editable install you only install airflow now, but as described
When you install Airflow from sources using editable install you only install Airflow now, but as described
in the previous chapter, you can develop together both - main version of Airflow and providers of your choice,
which is pretty convenient, because you can use the same environment for both.

@@ -284,7 +284,7 @@ all basic devel requirements and requirements of google provider as last success


In the future we will utilise ``uv.lock`` to manage dependencies and constraints, but for the moment we do not
commit ``uv.lock`` file to airflow repository because we need to figure out automation of updating the ``uv.lock``
commit ``uv.lock`` file to Airflow repository because we need to figure out automation of updating the ``uv.lock``
very frequently (few times a day sometimes). With Airflow's 700+ dependencies it's all but guaranteed that we
will have 3-4 changes a day and currently automated constraints generation mechanism in ``canary`` build keeps
constraints updated, but for ASF policy reasons we cannot update ``uv.lock`` in the same way - but work is in
Expand All @@ -300,7 +300,7 @@ succeeds. Usually what works in this case is running your install command withou

You can upgrade just airflow, without paying attention to provider's dependencies by using
the 'constraints-no-providers' constraint files. This allows you to keep installed provider dependencies
and install to latest supported ones by pure airflow core.
and install to latest supported ones by pure Airflow core.

.. code:: bash

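   # Hedged sketch of such an install (the original command is collapsed in this diff);
   # the Airflow version and the Python version in the constraints URL are placeholders.
   pip install --upgrade "apache-airflow==3.0.0" \
       --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-3.0.0/constraints-no-providers-3.10.txt"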
2 changes: 1 addition & 1 deletion contributing-docs/08_static_code_checks.rst
@@ -485,7 +485,7 @@ Mypy checks
When we run mypy checks locally when committing a change, one of the ``mypy-*`` checks is run, ``mypy-airflow``,
``mypy-dev``, ``mypy-providers``, ``mypy-airflow-ctl``, depending on the files you are changing. The mypy checks
are run by passing those changed files to mypy. This is way faster than running checks for all files (even
if mypy cache is used - especially when you change a file in airflow core that is imported and used by many
if mypy cache is used - especially when you change a file in Airflow core that is imported and used by many
files). However, in some cases, it produces different results than when running checks for the whole set
of files, because ``mypy`` does not even know that some types are defined in other files and it might not
be able to follow imports properly if they are dynamic. Therefore in CI we run ``mypy`` check for whole
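(A hedged sketch of running one of those checks locally — the hook ids are the ones listed above, the file path is a placeholder:)

.. code-block:: bash

   # Check only the files you touched (what the pre-commit hook does on commit)
   pre-commit run mypy-airflow --files airflow-core/src/airflow/models/dag.py

   # Closer to what CI does: run the check over the whole set of files
   pre-commit run mypy-airflow --all-files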
4 changes: 2 additions & 2 deletions contributing-docs/09_testing.rst
@@ -44,10 +44,10 @@ includes:
* `System tests <testing/system_tests.rst>`__ are automatic tests that use external systems like
Google Cloud and AWS. These tests are intended for an end-to-end DAG execution.

You can also run other kinds of tests when you are developing airflow packages:
You can also run other kinds of tests when you are developing Airflow packages:

* `Testing distributions <testing/testing_distributions.rst>`__ is a document that describes how to
manually build and test pre-release candidate distributions of airflow and providers.
manually build and test pre-release candidate distributions of Airflow and providers.

* `Python client tests <testing/python_client_tests.rst>`__ are tests we run to check if the Python API
client works correctly.