Commit 7fa85e2

Fix simple grammar mistakes in doc (#51138)
* Fix simple grammar mistakes in doc "it's" -> "its" when it should be
1 parent 9068ada commit 7fa85e2

10 files changed: +12 / -12 lines
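The edits in this commit are mechanical enough to script for the unambiguous cases. A minimal sketch (hypothetical, not part of the commit) that rewrites only the "it's own" pattern, where the possessive reading is the only grammatical one:

```python
import re

def fix_possessive_its(text: str) -> str:
    """Rewrite the misspelled possessive "it's own" as "its own".

    The contraction reading ("it is own") is never grammatical, so this
    narrow pattern is safe to apply mechanically. Capitalization is
    preserved via the backreference.
    """
    return re.sub(r"\b([Ii]t)'s own\b", r"\1s own", text)
```

The other fixes in this commit ("it's use", "it's precedence", "it's development dependencies") require judging whether the contraction was intended, so they are better left to manual review.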


ISSUE_TRIAGE_PROCESS.rst

Lines changed: 1 addition & 1 deletion
@@ -196,7 +196,7 @@ associated with them such as ``provider:amazon-aws``, ``provider:microsoft-azure
 These make it easier for developers working on a single provider to
 track issues for that provider.

-Note: each provider has it's own unique label. It is possible for issue to be tagged with more than 1 provider label.
+Note: each provider has its own unique label. It is possible for issue to be tagged with more than 1 provider label.

 Most issues need a combination of "kind" and "area" labels to be actionable.
 For example:

airflow-core/docs/tutorial/fundamentals.rst

Lines changed: 1 addition & 1 deletion
@@ -43,7 +43,7 @@ each line in detail.
 Understanding the DAG Definition File
 -------------------------------------
 Think of the Airflow Python script as a configuration file that lays out the structure of your DAG in code. The actual
-tasks you define here run in a different environment, which means this script isn't meant for data processing. It's main
+tasks you define here run in a different environment, which means this script isn't meant for data processing. Its main
 job is to define the DAG object, and it needs to evaluate quickly since the DAG File Processor checks it regularly for
 any changes.

contributing-docs/03_contributors_quick_start.rst

Lines changed: 1 addition & 1 deletion
@@ -173,7 +173,7 @@ Setting up virtual-env
 ----------------------

 1. While you can use any virtualenv manager, we recommend using `UV <https://github.com/astral-sh/uv>`__
-as your build and integration frontend. You can read more about UV and it's use in
+as your build and integration frontend. You can read more about UV and its use in
 Airflow in `Local virtualenv <07_local_virtualenv.rst>`_.

 2. After creating the environment, you need to install a few more required packages for Airflow. The below command adds

contributing-docs/07_local_virtualenv.rst

Lines changed: 2 additions & 2 deletions
@@ -203,7 +203,7 @@ the provider's folder and running ``uv sync`` there. For example, to install dep
 uv sync

 This will use the ``.venv`` environment in the root of your project and will install dependency of your
-provider and providers it depends on and it's development dependencies.
+provider and providers it depends on and its development dependencies.

 Then running tests for the provider is as simple as activating the venv in the main repo and running pytest
 command - or alternatively running ``uv run`` in the provider directory.:
@@ -213,7 +213,7 @@ command - or alternatively running ``uv run`` in the provider directory.:
 uv run pytest

 Note that the ``uv sync`` command will automatically synchronize all dependencies needed for your provider
-and it's development dependencies.
+and its development dependencies.

 Creating and installing Airflow with other build-frontends
 ----------------------------------------------------------

contributing-docs/quick-start-ide/contributors_quick_start_pycharm.rst

Lines changed: 1 addition & 1 deletion
@@ -51,7 +51,7 @@ airflow core. If you plan to work on providers, at this time you can install dep

 $ uv sync --all-packages

-Or for specific provider and it's cross-provider dependencies:
+Or for specific provider and its cross-provider dependencies:

 .. code-block:: bash

contributing-docs/testing/unit_tests.rst

Lines changed: 1 addition & 1 deletion
@@ -1343,7 +1343,7 @@ to figure out one of the problems:
 dependency group for the provider - sometimes tests need another provider to be installed that is not
 normally needed as required dependencies of the provider being tested. Those dependencies
 should be added after the ``# Additional devel dependencies`` comment in case of providers. Adding the
-dependencies here means that when ``uv sync`` is run, the packages and it's dependencies will be installed.
+dependencies here means that when ``uv sync`` is run, the packages and its dependencies will be installed.

 .. code-block:: toml

dev/breeze/doc/03_developer_tasks.rst

Lines changed: 2 additions & 2 deletions
@@ -442,7 +442,7 @@ Breeze uses docker images heavily and those images are rebuild periodically and
 images in docker cache. This might cause extra disk usage. Also running various docker compose commands
 (for example running tests with ``breeze testing core-tests``) might create additional docker networks that might
 prevent new networks from being created. Those networks are not removed automatically by docker-compose.
-Also Breeze uses it's own cache to keep information about all images.
+Also Breeze uses its own cache to keep information about all images.

 All those unused images, networks and cache can be removed by running ``breeze cleanup`` command. By default
 it will not remove the most recent images that you might need to run breeze commands, but you
@@ -461,7 +461,7 @@ These are all available flags of ``cleanup`` command:
 Database and config volumes in Breeze
 -------------------------------------

-Breeze keeps data for all it's integration, database, configuration in named docker volumes.
+Breeze keeps data for all its integration, database, configuration in named docker volumes.
 Those volumes are persisted until ``breeze down`` command. You can also preserve the volumes by adding
 flag ``--preserve-volumes`` when you run the command. Then, next time when you start Breeze,
 it will have the data pre-populated.

providers/odbc/docs/changelog.rst

Lines changed: 1 addition & 1 deletion
@@ -122,7 +122,7 @@ Misc
 Features
 ~~~~~~~~

-* ``refactor: OdbcHook must use it's own connection when creating a sqlalchemy engine (#43145)``
+* ``refactor: OdbcHook must use its own connection when creating a sqlalchemy engine (#43145)``


 .. Below changes are excluded from the changelog. Move them to

providers/openlineage/provider.yaml

Lines changed: 1 addition & 1 deletion
@@ -74,7 +74,7 @@ config:
 openlineage:
 description: |
 This section applies settings for OpenLineage integration.
-More about configuration and it's precedence can be found in the `user's guide
+More about configuration and its precedence can be found in the `user's guide
 <https://airflow.apache.org/docs/apache-airflow-providers-openlineage/stable/guides/user.html#transport-setup>`_.

 options:

providers/openlineage/src/airflow/providers/openlineage/get_provider_info.py

Lines changed: 1 addition & 1 deletion
@@ -42,7 +42,7 @@ def get_provider_info():
 ],
 "config": {
 "openlineage": {
-"description": "This section applies settings for OpenLineage integration.\nMore about configuration and it's precedence can be found in the `user's guide\n<https://airflow.apache.org/docs/apache-airflow-providers-openlineage/stable/guides/user.html#transport-setup>`_.\n",
+"description": "This section applies settings for OpenLineage integration.\nMore about configuration and its precedence can be found in the `user's guide\n<https://airflow.apache.org/docs/apache-airflow-providers-openlineage/stable/guides/user.html#transport-setup>`_.\n",
 "options": {
 "disabled": {
 "description": "Disable sending events without uninstalling the OpenLineage Provider by setting this to true.\n",
