Merged
2 changes: 1 addition & 1 deletion airflow/__init__.py
Original file line number Diff line number Diff line change
@@ -26,7 +26,7 @@
"""
from __future__ import annotations

-__version__ = "2.6.0.dev0"
+__version__ = "2.7.0.dev0"

# flake8: noqa: F401

10 changes: 10 additions & 0 deletions airflow/config_templates/default_airflow.cfg
@@ -250,6 +250,13 @@ daemon_umask = 0o077
# Example: dataset_manager_kwargs = {{"some_param": "some_value"}}
# dataset_manager_kwargs =

# (experimental) Whether components should use Airflow Internal API for DB connectivity.
A reviewer (Member) commented:
Why do we have this change? This seems unrelated.

@potiuk (Member, Author) replied on Apr 29, 2023:

Quite the contrary. They are very much related. They were (correctly) automatically added by pre-commit.

Those parameters have version-added = 2.7.0. This was implemented as part of the feature flags for AIP-44 and AIP-52, but in order not to complicate things further, we restricted the automated moving of parameters from .yaml to the config template to those whose "version-added" is not greater than the current version. We set "version-added=2.7.0" on the related parameters that were scheduled for 2.7, so they started to appear only when we moved the version to 2.7.0.

database_access_isolation = False

# (experimental) Airflow Internal API url. Only used if [core] database_access_isolation is True.
# Example: internal_api_url = http://localhost:8080
# internal_api_url =
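As with any option in airflow.cfg, these can also be set through environment variables using Airflow's AIRFLOW__{SECTION}__{KEY} convention (the values below are illustrative):

```shell
# Equivalent environment-variable form of the [core] options above;
# double underscores separate the section name from the key.
export AIRFLOW__CORE__DATABASE_ACCESS_ISOLATION=True
export AIRFLOW__CORE__INTERNAL_API_URL=http://localhost:8080
```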

[database]
# The SqlAlchemy connection string to the metadata database.
# SqlAlchemy supports many different database engines.
@@ -857,6 +864,9 @@ audit_view_excluded_events = gantt,landing_times,tries,duration,calendar,graph,g
# Boolean for running SwaggerUI in the webserver.
enable_swagger_ui = True

# Boolean for running Internal API in the webserver.
run_internal_api = False

# Boolean for enabling rate limiting on authentication endpoints.
auth_rate_limited = True

10 changes: 5 additions & 5 deletions docs/docker-stack/README.md
@@ -31,12 +31,12 @@ Every time a new version of Airflow is released, the images are prepared in the
[apache/airflow DockerHub](https://hub.docker.com/r/apache/airflow)
for all the supported Python versions.

-You can find the following images there (Assuming Airflow version `2.6.0.dev0`):
+You can find the following images there (Assuming Airflow version `2.7.0.dev0`):

* `apache/airflow:latest` - the latest released Airflow image with default Python version (3.7 currently)
* `apache/airflow:latest-pythonX.Y` - the latest released Airflow image with specific Python version
-* `apache/airflow:2.6.0.dev0` - the versioned Airflow image with default Python version (3.7 currently)
-* `apache/airflow:2.6.0.dev0-pythonX.Y` - the versioned Airflow image with specific Python version
+* `apache/airflow:2.7.0.dev0` - the versioned Airflow image with default Python version (3.7 currently)
+* `apache/airflow:2.7.0.dev0-pythonX.Y` - the versioned Airflow image with specific Python version

Those are "reference" regular images. They contain the most common set of extras, dependencies and providers that are
often used by users, and they are good for "trying things out" when you want to just take Airflow for a spin,
@@ -47,8 +47,8 @@ via [Building the image](https://airflow.apache.org/docs/docker-stack/build.html)

* `apache/airflow:slim-latest` - the latest released Airflow image with default Python version (3.7 currently)
* `apache/airflow:slim-latest-pythonX.Y` - the latest released Airflow image with specific Python version
-* `apache/airflow:slim-2.6.0.dev0` - the versioned Airflow image with default Python version (3.7 currently)
-* `apache/airflow:slim-2.6.0.dev0-pythonX.Y` - the versioned Airflow image with specific Python version
+* `apache/airflow:slim-2.7.0.dev0` - the versioned Airflow image with default Python version (3.7 currently)
+* `apache/airflow:slim-2.7.0.dev0-pythonX.Y` - the versioned Airflow image with specific Python version

The Apache Airflow image provided as a convenience package is optimized for size, and
it provides just a bare-minimal set of extras and dependencies installed, and in most cases
@@ -28,7 +28,7 @@ mkdir -p docker-context-files

cat <<EOF >./docker-context-files/requirements.txt
beautifulsoup4==4.10.0
-apache-airflow==2.6.0.dev0
+apache-airflow==2.7.0.dev0
EOF

export DOCKER_BUILDKIT=1
@@ -15,7 +15,7 @@

# This is an example Dockerfile. It is not intended for PRODUCTION use
# [START Dockerfile]
-FROM apache/airflow:2.6.0.dev0
+FROM apache/airflow:2.7.0.dev0
USER root
RUN apt-get update \
&& apt-get install -y --no-install-recommends \
@@ -15,7 +15,7 @@

# This is an example Dockerfile. It is not intended for PRODUCTION use
# [START Dockerfile]
-FROM apache/airflow:2.6.0.dev0
+FROM apache/airflow:2.7.0.dev0
USER root
RUN apt-get update \
&& apt-get install -y --no-install-recommends \
@@ -15,7 +15,7 @@

# This is an example Dockerfile. It is not intended for PRODUCTION use
# [START Dockerfile]
-FROM apache/airflow:2.6.0.dev0
+FROM apache/airflow:2.7.0.dev0
USER root
RUN apt-get update \
&& apt-get install -y --no-install-recommends \
@@ -15,6 +15,6 @@

# This is an example Dockerfile. It is not intended for PRODUCTION use
# [START Dockerfile]
-FROM apache/airflow:2.6.0.dev0
-RUN pip install --no-cache-dir lxml apache-airflow==2.6.0.dev0
+FROM apache/airflow:2.7.0.dev0
+RUN pip install --no-cache-dir lxml apache-airflow==2.7.0.dev0
# [END Dockerfile]
@@ -15,7 +15,7 @@

# This is an example Dockerfile. It is not intended for PRODUCTION use
# [START Dockerfile]
-FROM apache/airflow:2.6.0.dev0
+FROM apache/airflow:2.7.0.dev0
COPY requirements.txt /
-RUN pip install --no-cache-dir -r /requirements.txt apache-airflow==2.6.0.dev0
+RUN pip install --no-cache-dir -r /requirements.txt apache-airflow==2.7.0.dev0
# [END Dockerfile]
@@ -15,6 +15,6 @@

# This is an example Dockerfile. It is not intended for PRODUCTION use
# [START Dockerfile]
-FROM apache/airflow:2.6.0.dev0
-RUN pip install --no-cache-dir apache-airflow-providers-docker==2.5.1 apache-airflow==2.6.0.dev0
+FROM apache/airflow:2.7.0.dev0
+RUN pip install --no-cache-dir apache-airflow-providers-docker==2.5.1 apache-airflow==2.7.0.dev0
# [END Dockerfile]
@@ -15,7 +15,7 @@

# This is an example Dockerfile. It is not intended for PRODUCTION use
# [START Dockerfile]
-FROM apache/airflow:2.6.0.dev0
+FROM apache/airflow:2.7.0.dev0

COPY --chown=airflow:root test_dag.py /opt/airflow/dags

@@ -15,7 +15,7 @@

# This is an example Dockerfile. It is not intended for PRODUCTION use
# [START Dockerfile]
-FROM apache/airflow:2.6.0.dev0
+FROM apache/airflow:2.7.0.dev0
RUN umask 0002; \
mkdir -p ~/writeable-directory
# [END Dockerfile]
18 changes: 9 additions & 9 deletions docs/docker-stack/entrypoint.rst
@@ -132,7 +132,7 @@ if you specify extra arguments. For example:

.. code-block:: bash

-docker run -it apache/airflow:2.6.0.dev0-python3.7 bash -c "ls -la"
+docker run -it apache/airflow:2.7.0.dev0-python3.7 bash -c "ls -la"
total 16
drwxr-xr-x 4 airflow root 4096 Jun 5 18:12 .
drwxr-xr-x 1 root root 4096 Jun 5 18:12 ..
@@ -144,21 +144,21 @@ you pass extra parameters. For example:

.. code-block:: bash

-> docker run -it apache/airflow:2.6.0.dev0-python3.7 python -c "print('test')"
+> docker run -it apache/airflow:2.7.0.dev0-python3.7 python -c "print('test')"
test

If the first argument equals "airflow", the rest of the arguments are treated as an airflow command
to execute. Example:

.. code-block:: bash

-docker run -it apache/airflow:2.6.0.dev0-python3.7 airflow webserver
+docker run -it apache/airflow:2.7.0.dev0-python3.7 airflow webserver

If there are any other arguments, they are simply passed to the "airflow" command:

.. code-block:: bash

-> docker run -it apache/airflow:2.6.0.dev0-python3.7 help
+> docker run -it apache/airflow:2.7.0.dev0-python3.7 help
usage: airflow [-h] GROUP_OR_COMMAND ...

positional arguments:
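The dispatch rules above can be sketched as a simplified shell function. This is an illustration of the described behavior, not the real entrypoint script, which also performs setup and execs the resolved command rather than printing it:

```shell
# Resolve the command line the entrypoint would run, per the rules above.
resolve_command() {
  case "$1" in
    bash|python)
      # Known interpreters are executed directly with their arguments.
      echo "$@"
      ;;
    airflow)
      # "airflow" prefix: the remaining args are an airflow subcommand,
      # so "$@" is already the full command to run.
      echo "$@"
      ;;
    *)
      # Anything else is passed straight to the "airflow" command.
      echo "airflow $@"
      ;;
  esac
}

resolve_command airflow webserver   # airflow webserver
resolve_command help                # airflow help
```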
@@ -206,7 +206,7 @@ propagation (See the next chapter).

.. code-block:: Dockerfile

-FROM airflow:2.6.0.dev0
+FROM airflow:2.7.0.dev0
COPY my_entrypoint.sh /
ENTRYPOINT ["/usr/bin/dumb-init", "--", "/my_entrypoint.sh"]

@@ -250,7 +250,7 @@ Similarly to custom entrypoint, it can be added to the image by extending it.

.. code-block:: Dockerfile

-FROM airflow:2.6.0.dev0
+FROM airflow:2.7.0.dev0
COPY my_after_entrypoint_script.sh /

Build your image and then you can run this script by running the command:
@@ -363,7 +363,7 @@ database and creating an ``admin/admin`` Admin user with the following command:
--env "_AIRFLOW_DB_UPGRADE=true" \
--env "_AIRFLOW_WWW_USER_CREATE=true" \
--env "_AIRFLOW_WWW_USER_PASSWORD=admin" \
-apache/airflow:2.6.0.dev0-python3.8 webserver
+apache/airflow:2.7.0.dev0-python3.8 webserver


.. code-block:: bash
@@ -372,7 +372,7 @@ database and creating an ``admin/admin`` Admin user with the following command:
--env "_AIRFLOW_DB_UPGRADE=true" \
--env "_AIRFLOW_WWW_USER_CREATE=true" \
--env "_AIRFLOW_WWW_USER_PASSWORD_CMD=echo admin" \
-apache/airflow:2.6.0.dev0-python3.8 webserver
+apache/airflow:2.7.0.dev0-python3.8 webserver

The commands above initialize the SQLite database and create an admin user with the password ``admin``
and the Admin role. They also forward local port ``8080`` to the webserver port and finally start the webserver.
@@ -412,6 +412,6 @@ Example:
--env "_AIRFLOW_DB_UPGRADE=true" \
--env "_AIRFLOW_WWW_USER_CREATE=true" \
--env "_AIRFLOW_WWW_USER_PASSWORD_CMD=echo admin" \
-apache/airflow:2.6.0.dev0-python3.8 webserver
+apache/airflow:2.7.0.dev0-python3.8 webserver

This method is only available in Docker images of Airflow 2.1.1 and above.