Merged
2 changes: 1 addition & 1 deletion ISSUE_TRIAGE_PROCESS.rst
@@ -196,7 +196,7 @@ associated with them such as ``provider:amazon-aws``, ``provider:microsoft-azure``
These make it easier for developers working on a single provider to
track issues for that provider.

-Note: each provider has it's own unique label. It is possible for issue to be tagged with more than 1 provider label.
+Note: each provider has its own unique label. It is possible for issue to be tagged with more than 1 provider label.

Most issues need a combination of "kind" and "area" labels to be actionable.
For example:
2 changes: 1 addition & 1 deletion airflow-core/docs/tutorial/fundamentals.rst
@@ -43,7 +43,7 @@ each line in detail.
Understanding the DAG Definition File
-------------------------------------
Think of the Airflow Python script as a configuration file that lays out the structure of your DAG in code. The actual
-tasks you define here run in a different environment, which means this script isn't meant for data processing. It's main
+tasks you define here run in a different environment, which means this script isn't meant for data processing. Its main
job is to define the DAG object, and it needs to evaluate quickly since the DAG File Processor checks it regularly for
any changes.

2 changes: 1 addition & 1 deletion contributing-docs/03_contributors_quick_start.rst
@@ -173,7 +173,7 @@ Setting up virtual-env
----------------------

1. While you can use any virtualenv manager, we recommend using `UV <https://github.com/astral-sh/uv>`__
-as your build and integration frontend. You can read more about UV and it's use in
+as your build and integration frontend. You can read more about UV and its use in
Airflow in `Local virtualenv <07_local_virtualenv.rst>`_.

2. After creating the environment, you need to install a few more required packages for Airflow. The below command adds
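A note on the hunk above: the install command it leads into is truncated in this diff view and is not reconstructed here. The environment-creation step it builds on can be sketched as follows, assuming uv's standard ``uv venv`` command and a POSIX shell:

.. code-block:: bash

    # Create a virtualenv with uv (defaults to ./.venv) and activate it;
    # the follow-up "install required packages" command is elided in the diff.
    uv venv
    source .venv/bin/activate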
4 changes: 2 additions & 2 deletions contributing-docs/07_local_virtualenv.rst
@@ -203,7 +203,7 @@ the provider's folder and running ``uv sync`` there. For example, to install dep
uv sync

This will use the ``.venv`` environment in the root of your project and will install dependency of your
-provider and providers it depends on and it's development dependencies.
+provider and providers it depends on and its development dependencies.

Then running tests for the provider is as simple as activating the venv in the main repo and running pytest
command - or alternatively running ``uv run`` in the provider directory.:
@@ -213,7 +213,7 @@ command - or alternatively running ``uv run`` in the provider directory.:
uv run pytest

Note that the ``uv sync`` command will automatically synchronize all dependencies needed for your provider
-and it's development dependencies.
+and its development dependencies.

Creating and installing Airflow with other build-frontends
----------------------------------------------------------
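Taken together, the two hunks above describe a provider test workflow that can be sketched like this (the provider path is illustrative, not taken from the diff):

.. code-block:: bash

    # From a provider directory, uv sync installs the provider's dependencies,
    # the providers it depends on, and its development dependencies ...
    cd providers/odbc
    uv sync

    # ... after which the provider's tests run inside the synced environment.
    uv run pytest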
@@ -51,7 +51,7 @@ airflow core. If you plan to work on providers, at this time you can install dep

$ uv sync --all-packages

-Or for specific provider and it's cross-provider dependencies:
+Or for specific provider and its cross-provider dependencies:

.. code-block:: bash

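The single-provider command in the hunk above is truncated in this diff view. A sketch of both forms, where the second is an assumption about uv's workspace ``--package`` selector rather than a quote from the diff:

.. code-block:: bash

    # Shown in full above: install dependencies for every workspace package.
    uv sync --all-packages

    # Hypothetical single-provider form (not taken from the diff); the
    # package name is illustrative.
    uv sync --package apache-airflow-providers-odbc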
2 changes: 1 addition & 1 deletion contributing-docs/testing/unit_tests.rst
@@ -1343,7 +1343,7 @@ to figure out one of the problems:
dependency group for the provider - sometimes tests need another provider to be installed that is not
normally needed as required dependencies of the provider being tested. Those dependencies
should be added after the ``# Additional devel dependencies`` comment in case of providers. Adding the
-dependencies here means that when ``uv sync`` is run, the packages and it's dependencies will be installed.
+dependencies here means that when ``uv sync`` is run, the packages and its dependencies will be installed.

.. code-block:: toml

4 changes: 2 additions & 2 deletions dev/breeze/doc/03_developer_tasks.rst
@@ -442,7 +442,7 @@ Breeze uses docker images heavily and those images are rebuild periodically and
images in docker cache. This might cause extra disk usage. Also running various docker compose commands
(for example running tests with ``breeze testing core-tests``) might create additional docker networks that might
prevent new networks from being created. Those networks are not removed automatically by docker-compose.
-Also Breeze uses it's own cache to keep information about all images.
+Also Breeze uses its own cache to keep information about all images.

All those unused images, networks and cache can be removed by running ``breeze cleanup`` command. By default
it will not remove the most recent images that you might need to run breeze commands, but you
@@ -461,7 +461,7 @@ These are all available flags of ``cleanup`` command:
Database and config volumes in Breeze
-------------------------------------

-Breeze keeps data for all it's integration, database, configuration in named docker volumes.
+Breeze keeps data for all its integration, database, configuration in named docker volumes.
Those volumes are persisted until ``breeze down`` command. You can also preserve the volumes by adding
flag ``--preserve-volumes`` when you run the command. Then, next time when you start Breeze,
it will have the data pre-populated.
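The two hunks above name three concrete commands; put together, using only the flags the surrounding text mentions:

.. code-block:: bash

    # Remove unused Breeze images, leftover docker-compose networks, and
    # Breeze's own image-information cache.
    breeze cleanup

    # Stop Breeze but keep the named database/config volumes, so the next
    # start comes up with data pre-populated.
    breeze down --preserve-volumes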
2 changes: 1 addition & 1 deletion providers/odbc/docs/changelog.rst
@@ -122,7 +122,7 @@ Misc
Features
~~~~~~~~

-* ``refactor: OdbcHook must use it's own connection when creating a sqlalchemy engine (#43145)``
+* ``refactor: OdbcHook must use its own connection when creating a sqlalchemy engine (#43145)``


.. Below changes are excluded from the changelog. Move them to
2 changes: 1 addition & 1 deletion providers/openlineage/provider.yaml
@@ -74,7 +74,7 @@ config:
openlineage:
description: |
This section applies settings for OpenLineage integration.
-More about configuration and it's precedence can be found in the `user's guide
+More about configuration and its precedence can be found in the `user's guide
<https://airflow.apache.org/docs/apache-airflow-providers-openlineage/stable/guides/user.html#transport-setup>`_.

options:
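For context on this config section: Airflow maps configuration options onto environment variables as ``AIRFLOW__<SECTION>__<OPTION>``, so an option in the ``openlineage`` section, such as ``disabled`` (visible in the generated provider info below), can be set like this:

.. code-block:: bash

    # Disable sending OpenLineage events without uninstalling the provider.
    export AIRFLOW__OPENLINEAGE__DISABLED=true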
@@ -42,7 +42,7 @@ def get_provider_info():
],
"config": {
"openlineage": {
"description": "This section applies settings for OpenLineage integration.\nMore about configuration and it's precedence can be found in the `user's guide\n<https://airflow.apache.org/docs/apache-airflow-providers-openlineage/stable/guides/user.html#transport-setup>`_.\n",
"description": "This section applies settings for OpenLineage integration.\nMore about configuration and its precedence can be found in the `user's guide\n<https://airflow.apache.org/docs/apache-airflow-providers-openlineage/stable/guides/user.html#transport-setup>`_.\n",
"options": {
"disabled": {
"description": "Disable sending events without uninstalling the OpenLineage Provider by setting this to true.\n",