Add support for Python 3.13 #46891

Open: wants to merge 5 commits into base: main
4 changes: 3 additions & 1 deletion .github/actions/post_tests_success/action.yml
@@ -47,7 +47,9 @@ runs:
uses: codecov/codecov-action@b9fd7d16f6d7d1b5d2bec1a2887e65ceed900238
env:
CODECOV_TOKEN: ${{ inputs.codecov-token }}
if: env.ENABLE_COVERAGE == 'true' && env.TEST_TYPES != 'Helm' && inputs.python-version != '3.12'
if: >
env.ENABLE_COVERAGE == 'true' && env.TEST_TYPES != 'Helm' && inputs.python-version != '3.12'
&& inputs.python-version != '3.13'
with:
name: coverage-${{env.JOB_ID}}
flags: python-${{ env.PYTHON_MAJOR_MINOR_VERSION }},${{ env.BACKEND }}-${{ env.BACKEND_VERSION }}
6 changes: 6 additions & 0 deletions .github/actions/prepare_all_ci_images/action.yml
@@ -66,3 +66,9 @@ runs:
platform: ${{ inputs.platform }}
python: "3.12"
python-versions-list-as-string: ${{ inputs.python-versions-list-as-string }}
- name: "Restore CI docker image ${{ inputs.platform }}:3.13"
uses: ./.github/actions/prepare_single_ci_image
with:
platform: ${{ inputs.platform }}
python: "3.13"
python-versions-list-as-string: ${{ inputs.python-versions-list-as-string }}
3 changes: 2 additions & 1 deletion .github/workflows/run-unit-tests.yml
@@ -179,7 +179,8 @@ jobs:
- name: >
Migration Tests: ${{ matrix.python-version }}:${{ env.PARALLEL_TEST_TYPES }}
uses: ./.github/actions/migration_tests
if: inputs.run-migration-tests == 'true' && inputs.test-group == 'core'
# TODO(potiuk) - we should bring back migration tests when we move the migration to start from Airflow 3
if: inputs.run-migration-tests == 'true' && inputs.test-group == 'core' && matrix.python-version != 3.13
- name: >
${{ inputs.test-group }}:${{ inputs.test-scope }} Tests ${{ inputs.test-name }} ${{ matrix.backend-version }}
Py${{ matrix.python-version }}:${{ env.PARALLEL_TEST_TYPES }}
2 changes: 1 addition & 1 deletion Dockerfile
@@ -1493,7 +1493,7 @@ COPY --from=scripts install_mysql.sh install_mssql.sh install_postgres.sh /scrip
RUN bash /scripts/docker/install_mysql.sh dev && \
bash /scripts/docker/install_mssql.sh dev && \
bash /scripts/docker/install_postgres.sh dev
ENV PATH=${PATH}:/opt/mssql-tools/bin
ENV PATH=${HOME}/.cargo/bin:${PATH}:/opt/mssql-tools/bin

# By default we do not install from docker context files but if we decide to install from docker context
# files, we should override those variables to "docker-context-files"
4 changes: 2 additions & 2 deletions Dockerfile.ci
@@ -1401,8 +1401,8 @@ ENV AIRFLOW_PIP_VERSION=${AIRFLOW_PIP_VERSION} \
UV_LINK_MODE=copy \
AIRFLOW_PRE_COMMIT_VERSION=${AIRFLOW_PRE_COMMIT_VERSION}

# The PATH is needed for PIPX to find the tools installed
ENV PATH="/root/.local/bin:${PATH}"
# The PATH is needed for PIPX to find the tools installed and cargo to build the wheels
ENV PATH="/root/.local/bin:/root/.cargo/bin:${PATH}"

# Useful for creating a cache id based on the underlying architecture, preventing the use of cached python packages from
# an incorrect architecture.
16 changes: 8 additions & 8 deletions README.md
@@ -99,14 +99,14 @@ Airflow is not a streaming solution, but it is often used to process real-time d

Apache Airflow is tested with:

| | Main version (dev) | Stable version (3.0.2) |
|------------|------------------------|------------------------|
| Python | 3.9, 3.10, 3.11, 3.12 | 3.9, 3.10, 3.11, 3.12 |
| Platform | AMD64/ARM64(\*) | AMD64/ARM64(\*) |
| Kubernetes | 1.30, 1.31, 1.32, 1.33 | 1.30, 1.31, 1.32, 1.33 |
| PostgreSQL | 13, 14, 15, 16, 17 | 13, 14, 15, 16, 17 |
| MySQL | 8.0, 8.4, Innovation | 8.0, 8.4, Innovation |
| SQLite | 3.15.0+ | 3.15.0+ |
| | Main version (dev) | Stable version (3.0.2) |
|------------|-----------------------------|------------------------|
| Python | 3.9, 3.10, 3.11, 3.12, 3.13 | 3.9, 3.10, 3.11, 3.12 |
| Platform | AMD64/ARM64(\*) | AMD64/ARM64(\*) |
| Kubernetes | 1.30, 1.31, 1.32, 1.33 | 1.30, 1.31, 1.32, 1.33 |
| PostgreSQL | 13, 14, 15, 16, 17 | 13, 14, 15, 16, 17 |
| MySQL | 8.0, 8.4, Innovation | 8.0, 8.4, Innovation |
| SQLite | 3.15.0+ | 3.15.0+ |

\* Experimental

2 changes: 1 addition & 1 deletion airflow-core/docs/installation/prerequisites.rst
@@ -20,7 +20,7 @@ Prerequisites

Airflow® is tested with:

* Python: 3.9, 3.10, 3.11, 3.12
* Python: 3.9, 3.10, 3.11, 3.12, 3.13

* Databases:

2 changes: 1 addition & 1 deletion airflow-core/docs/start.rst
@@ -24,7 +24,7 @@ This quick start guide will help you bootstrap an Airflow standalone instance on

.. note::

Successful installation requires a Python 3 environment. Starting with Airflow 2.7.0, Airflow supports Python 3.9, 3.10, 3.11, and 3.12.
Successful installation requires a Python 3 environment. Starting with Airflow 2.7.0, Airflow supports Python 3.9, 3.10, 3.11, 3.12, and 3.13.

The officially supported installation method is with ``pip``.

10 changes: 5 additions & 5 deletions airflow-core/pyproject.toml
@@ -35,7 +35,7 @@ name = "apache-airflow-core"
description = "Core packages for Apache Airflow, schedule and API server"
readme = { file = "README.md", content-type = "text/markdown" }
license-files.globs = ["LICENSE", "3rd-party-licenses/*.txt", "NOTICE"]
requires-python = "~=3.9,<3.13"
requires-python = "~=3.9,<3.14"
authors = [
{ name = "Apache Software Foundation", email = "dev@airflow.apache.org" },
]
@@ -95,8 +95,8 @@ dependencies = [
"jinja2>=3.1.5",
"jsonschema>=4.19.1",
"lazy-object-proxy>=1.2.0",
'libcst >=1.1.0,!=1.8.1;python_version<"3.10"',
'libcst >=1.1.0;python_version>="3.10"',
# Ignore 1.8.1 as it misses typing extensions for Python 3.9
'libcst>=1.5.1,!=1.8.1',
"linkify-it-py>=2.0.0",
"lockfile>=0.12.2",
"methodtools>=0.4.7",
@@ -116,6 +116,7 @@ dependencies = [
"pathspec>=0.9.0",
'pendulum>=2.1.2,<4.0;python_version<"3.12"',
'pendulum>=3.0.0,<4.0;python_version>="3.12"',
'pendulum>=3.1.0 ; python_version>="3.13"',
"pluggy>=1.5.0",
"psutil>=5.8.0",
"pydantic>=2.11.0",
@@ -163,7 +164,7 @@ dependencies = [
"async" = [
"eventlet>=0.37.0",
"gevent>=24.2.1",
"greenlet>=0.4.9",
"greenlet>=3.1.0",
]
"graphviz" = [
# The graphviz package creates friction when installing on MacOS as it needs graphviz system package to
@@ -208,7 +209,6 @@ Mastodon = "https://fosstodon.org/@airflow"
Bluesky = "https://bsky.app/profile/apache-airflow.bsky.social"
YouTube = "https://www.youtube.com/channel/UCSXwxpWZQ7XZ1WL3wqevChA/"


[tool.hatch.version]
path = "src/airflow/__init__.py"

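The stacked `pendulum` markers above combine per interpreter: on Python 3.13 both the `pendulum>=3.0.0,<4.0` pin (from `python_version>="3.12"`) and the new `pendulum>=3.1.0` pin match, and pip intersects them. A simplified sketch of how the effective floor resolves (the function name is illustrative, not part of the PR):

```python
import sys

def pendulum_floor(version_info=sys.version_info):
    """Effective minimum pendulum version for an interpreter,
    mirroring the environment markers in the diff above (simplified)."""
    if version_info >= (3, 13):
        # Both >=3.0.0 and >=3.1.0 markers match; the intersection is >=3.1.0.
        return "3.1.0"
    if version_info >= (3, 12):
        return "3.0.0"
    return "2.1.2"
```

For example, `pendulum_floor((3, 13, 0))` returns `"3.1.0"` while `pendulum_floor((3, 9, 0))` returns `"2.1.2"`.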
10 changes: 10 additions & 0 deletions airflow-core/src/airflow/api_fastapi/core_api/app.py
@@ -18,6 +18,7 @@

import logging
import os
import sys
import warnings
from pathlib import Path

@@ -36,6 +37,8 @@

log = logging.getLogger(__name__)

PY313 = sys.version_info >= (3, 13)


def init_views(app: FastAPI) -> None:
"""Init views by registering the different routers."""
@@ -118,6 +121,13 @@ def init_flask_plugins(app: FastAPI) -> None:
try:
from airflow.providers.fab.www.app import create_app
except ImportError:
if PY313:
log.info(
"Some Airflow 2 plugins have been detected in your environment. Currently the FAB provider "
"does not support Python 3.13, so you cannot use Airflow 2 plugins with Airflow 3 until "
"the FAB provider becomes Python 3.13 compatible."
)
return
raise AirflowException(
"Some Airflow 2 plugins have been detected in your environment. "
"To run them with Airflow 3, you must install the FAB provider in your Airflow environment."
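The `PY313` guard above turns a hard failure into a soft skip when the FAB provider cannot be imported on Python 3.13. The pattern can be sketched in isolation like this (the helper and the missing module name are hypothetical, not Airflow APIs):

```python
import logging
import sys

log = logging.getLogger(__name__)
PY313 = sys.version_info >= (3, 13)

def init_optional_plugin(module_name: str):
    """Import an optional dependency; on Python 3.13, where it is known
    to be unsupported, log and return None instead of raising."""
    try:
        return __import__(module_name)
    except ImportError:
        if PY313:
            log.info("%s is not available on Python 3.13; skipping.", module_name)
            return None
        raise
```

On earlier interpreters the `ImportError` still propagates, so the existing error path is unchanged.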
2 changes: 1 addition & 1 deletion airflow-core/src/airflow/models/connection.py
@@ -44,7 +44,7 @@
# the symbols #,!,-,_,.,:,\,/ and () requiring at least one match.
#
# You can try the regex here: https://regex101.com/r/69033B/1
RE_SANITIZE_CONN_ID = re.compile(r"^[\w\#\!\(\)\-\.\:\/\\]{1,}$")
RE_SANITIZE_CONN_ID = re.compile(r"^[\w#!()\-.:/\\]{1,}$")
# the conn ID max len should be 250
CONN_ID_MAX_LEN: int = 250

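The new pattern only drops escapes that are redundant inside a character class, so it should accept and reject exactly the same connection IDs. A quick equivalence check (sample IDs are illustrative):

```python
import re

OLD = re.compile(r"^[\w\#\!\(\)\-\.\:\/\\]{1,}$")  # pattern before the change
NEW = re.compile(r"^[\w#!()\-.:/\\]{1,}$")         # cleaned-up pattern

# Both patterns agree on every sample, valid or not.
samples = ["my_conn", "conn-1.2:3", "a/b\\c", "conn(#!)", "", "bad conn", "x@y"]
for s in samples:
    assert bool(OLD.match(s)) == bool(NEW.match(s))
```

The empty string, whitespace, and characters outside the class (like `@`) are rejected by both versions.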
1 change: 1 addition & 0 deletions airflow-core/src/airflow/models/dag.py
@@ -1572,6 +1572,7 @@ def create_dagrun(
# This is also done on the DagRun model class, but SQLAlchemy column
# validator does not work well for some reason.
if not re.match(RUN_ID_REGEX, run_id):
# TODO(potiuk): check if it is ok to use regexp from configuration (likely yes)
regex = airflow_conf.get("scheduler", "allowed_run_id_pattern").strip()
if not regex or not re.match(regex, run_id):
raise ValueError(
2 changes: 1 addition & 1 deletion airflow-core/src/airflow/plugins_manager.py
@@ -131,7 +131,7 @@ class EntryPointSource(AirflowPluginSource):
"""Class used to define Plugins loaded from entrypoint."""

def __init__(self, entrypoint: metadata.EntryPoint, dist: metadata.Distribution):
self.dist = dist.metadata["Name"]
self.dist = dist.metadata["Name"] # type: ignore[index]
self.version = dist.version
self.entrypoint = str(entrypoint)

2 changes: 2 additions & 0 deletions airflow-core/src/airflow/providers_manager.py
@@ -592,6 +592,8 @@ def _discover_all_providers_from_packages(self) -> None:
and verifies only the subset of fields that are needed at runtime.
"""
for entry_point, dist in entry_points_with_dist("apache_airflow_provider"):
if not dist.metadata:
continue
package_name = canonicalize_name(dist.metadata["name"])
if package_name in self._provider_dict:
continue
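The `if not dist.metadata` guard above protects against distributions whose metadata cannot be read, which newer typeshed stubs flag as possible. A standalone sketch of the same defensive iteration (the function name is illustrative):

```python
from importlib import metadata

def installed_package_names() -> list[str]:
    """Collect names of installed distributions, skipping any whose
    metadata is missing or unreadable (defensive, as in the diff above)."""
    names = []
    for dist in metadata.distributions():
        if not dist.metadata:  # broken or partially removed install
            continue
        names.append(dist.metadata["Name"])
    return names
```

Without the guard, a single broken install in the environment would abort the whole discovery loop.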
19 changes: 17 additions & 2 deletions airflow-core/src/airflow/serialization/serializers/numpy.py
@@ -19,6 +19,8 @@

from typing import TYPE_CHECKING, Any

from packaging import version

from airflow.utils.module_loading import import_string, qualname

# lazy loading for performance reasons
@@ -74,10 +76,23 @@ def serialize(o: object) -> tuple[U, str, int, bool]:
if isinstance(o, np.bool_):
return bool(np), name, __version__, True

from importlib import metadata

numpy_version = metadata.version("numpy")
is_numpy_2 = version.parse(numpy_version).major >= 2
if isinstance(
o, (np.float_, np.float16, np.float32, np.float64, np.complex_, np.complex64, np.complex128)
o,
(
np.float64 if is_numpy_2 else np.float_, # type: ignore[attr-defined]
np.float16,
np.float32,
np.float64,
np.complex64 if is_numpy_2 else np.complex_, # type: ignore[attr-defined]
np.complex64,
np.complex128,
),
):
return float(o), name, __version__, True
return float(o), name, __version__, True # type: ignore [arg-type]

return "", "", 0, False

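NumPy 2 removed the `np.float_` and `np.complex_` aliases, which is why the serializer now picks the tuple members based on the installed major version. The version probe can be isolated with the stdlib alone (the helper name is illustrative; the diff itself uses `packaging.version`):

```python
from importlib import metadata

def package_major(name: str, default: int = 0) -> int:
    """Best-effort major version of an installed package (default if absent)."""
    try:
        return int(metadata.version(name).split(".")[0])
    except (metadata.PackageNotFoundError, ValueError):
        return default

# e.g. gate the legacy aliases only when NumPy 1.x is installed:
IS_NUMPY_2 = package_major("numpy") >= 2
```

`packaging.version.parse(...).major`, as used in the diff, is the more robust choice when the `packaging` dependency is already available.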
4 changes: 2 additions & 2 deletions airflow-core/src/airflow/traces/otel_tracer.py
@@ -267,13 +267,13 @@ def _new_span(
start_time=datetime_to_nano(start_time),
)
else:
span = tracer.start_span(
span = tracer.start_span( # type: ignore[assignment]
name=span_name,
context=parent_context,
links=links,
start_time=datetime_to_nano(start_time),
)
current_span_ctx = trace.set_span_in_context(NonRecordingSpan(span.get_span_context()))
current_span_ctx = trace.set_span_in_context(NonRecordingSpan(span.get_span_context())) # type: ignore[attr-defined]
# We have to manually make the span context as the active context.
# If the span needs to be injected into the carrier, then this is needed to make sure
# that the injected context will point to the span context that was just created.
30 changes: 8 additions & 22 deletions airflow-core/tests/unit/always/test_example_dags.py
@@ -87,12 +87,7 @@ def get_suspended_providers_folders() -> list[str]:
for provider_path in AIRFLOW_PROVIDERS_ROOT_PATH.rglob("provider.yaml"):
provider_yaml = yaml.safe_load(provider_path.read_text())
if provider_yaml["state"] == "suspended":
suspended_providers.append(
provider_path.parent.relative_to(AIRFLOW_ROOT_PATH)
.as_posix()
# TODO(potiuk): check
.replace("providers/src/airflow/providers/", "")
)
suspended_providers.append(provider_path.parent.resolve().as_posix())
return suspended_providers


@@ -106,12 +101,7 @@ def get_python_excluded_providers_folders() -> list[str]:
provider_yaml = yaml.safe_load(provider_path.read_text())
excluded_python_versions = provider_yaml.get("excluded-python-versions", [])
if CURRENT_PYTHON_VERSION in excluded_python_versions:
excluded_providers.append(
provider_path.parent.relative_to(AIRFLOW_ROOT_PATH)
.as_posix()
# TODO(potiuk): check
.replace("providers/src/airflow/providers/", "")
)
excluded_providers.append(provider_path.parent.resolve().as_posix())
return excluded_providers


@@ -127,16 +117,6 @@ def example_not_excluded_dags(xfail_db_exception: bool = False):

suspended_providers_folders = get_suspended_providers_folders()
current_python_excluded_providers_folders = get_python_excluded_providers_folders()
suspended_providers_folders = [
AIRFLOW_ROOT_PATH.joinpath(prefix, provider).as_posix()
for prefix in PROVIDERS_PREFIXES
for provider in suspended_providers_folders
]
current_python_excluded_providers_folders = [
AIRFLOW_ROOT_PATH.joinpath(prefix, provider).as_posix()
for prefix in PROVIDERS_PREFIXES
for provider in current_python_excluded_providers_folders
]
providers_folders = tuple([AIRFLOW_ROOT_PATH.joinpath(pp).as_posix() for pp in PROVIDERS_PREFIXES])
for example_dir in example_dirs:
candidates = glob(f"{AIRFLOW_ROOT_PATH.as_posix()}/{example_dir}", recursive=True)
@@ -192,6 +172,12 @@ def test_should_be_importable(example: str):
dag_folder=example,
include_examples=False,
)
if len(dagbag.import_errors) == 1 and "AirflowOptionalProviderFeatureException" in str(
dagbag.import_errors
):
pytest.skip(
f"Skipping {example} because it requires an optional provider feature that is not installed."
)
assert len(dagbag.import_errors) == 0, f"import_errors={str(dagbag.import_errors)}"
assert len(dagbag.dag_ids) >= 1

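The test change above skips an example only when its single import error stems from an optional provider feature. The decision logic in isolation (the function name is illustrative; the real test calls `pytest.skip`):

```python
def is_optional_feature_skip(import_errors: dict) -> bool:
    """True when the only import failure comes from an optional provider
    feature, mirroring the skip condition added in the diff above."""
    return (
        len(import_errors) == 1
        and "AirflowOptionalProviderFeatureException" in str(import_errors)
    )
```

Any second import error, or an error of a different kind, still fails the test as before.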
59 changes: 0 additions & 59 deletions airflow-core/tests/unit/always/test_pandas.py

This file was deleted.
