Minor doc fixes (#14547)
XD-DENG authored Mar 1, 2021
1 parent eee4876 commit 391baee
Showing 2 changed files with 7 additions and 7 deletions.
12 changes: 6 additions & 6 deletions docs/apache-airflow/installation.rst
@@ -63,7 +63,7 @@ issues from ``pip`` 20.3.0 release have been fixed in 20.3.3). In order to insta
either downgrade pip to version 20.2.4 ``pip install --upgrade pip==20.2.4`` or, in case you use Pip 20.3, you need to add option
``--use-deprecated legacy-resolver`` to your pip install command.
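The two workarounds above can be sketched as a short shell snippet. The commands are assembled into variables and echoed rather than executed, so the sketch is side-effect-free; run the printed commands directly in your own environment:

```shell
# Workaround 1: downgrade pip to 20.2.4, the last release before the new resolver.
downgrade_cmd="pip install --upgrade pip==20.2.4"

# Workaround 2: stay on pip 20.3+, but opt back in to the old resolver behaviour.
legacy_cmd="pip install apache-airflow --use-deprecated legacy-resolver"

echo "$downgrade_cmd"
echo "$legacy_cmd"
```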

- While they are some successes with using other tools like `poetry <https://python-poetry.org/>`_ or
+ While there are some successes with using other tools like `poetry <https://python-poetry.org/>`_ or
`pip-tools <https://pypi.org/project/pip-tools/>`_, they do not share the same workflow as
``pip`` - especially when it comes to constraint vs. requirements management.
Installing via ``Poetry`` or ``pip-tools`` is not currently supported. If you wish to install airflow
@@ -81,8 +81,8 @@ environment. For instance, if you don't need connectivity with Postgres,
you won't have to go through the trouble of installing the ``postgres-devel``
yum package, or whatever equivalent applies on the distribution you are using.

- Most of the extra dependencies are linked to a corresponding providers package. For example "amazon" extra
- has a corresponding ``apache-airflow-providers-amazon`` providers package to be installed. When you install
+ Most of the extra dependencies are linked to a corresponding provider package. For example "amazon" extra
+ has a corresponding ``apache-airflow-providers-amazon`` provider package to be installed. When you install
Airflow with such extras, the necessary provider packages are installed automatically (latest versions from
PyPI for those packages). However you can freely upgrade and install provider packages independently from
the main Airflow installation.
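The "amazon" example above can be sketched as follows. As in the previous sketch, the commands are echoed rather than executed; the extra name comes from the text, while treating it as the install target here is illustrative:

```shell
# Installing the "amazon" extra pulls in the matching provider package
# (apache-airflow-providers-amazon) automatically, at its latest PyPI version.
extra_install='pip install "apache-airflow[amazon]"'

# The provider package can later be upgraded independently of core Airflow.
provider_upgrade="pip install --upgrade apache-airflow-providers-amazon"

echo "$extra_install"
echo "$provider_upgrade"
```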
@@ -96,7 +96,7 @@ Provider packages

Unlike Apache Airflow 1.10, the Airflow 2.0 is delivered in multiple, separate, but connected packages.
The core of Airflow scheduling system is delivered as ``apache-airflow`` package and there are around
- 60 providers packages which can be installed separately as so called ``Airflow Provider packages``.
+ 60 provider packages which can be installed separately as so called ``Airflow Provider packages``.
The default Airflow installation doesn't have many integrations and you have to install them yourself.

You can even develop and install your own providers for Airflow. For more information,
@@ -164,9 +164,9 @@ In order to have repeatable installation, starting from **Airflow 1.10.10** and
``constraints-master``, ``constraints-2-0`` and ``constraints-1-10`` orphan branches and then we create tag
for each released version e.g. ``constraints-2.0.1``. This way, when we keep a tested and working set of dependencies.

- Those "known-to-be-working" constraints are per major/minor python version. You can use them as constraint
+ Those "known-to-be-working" constraints are per major/minor Python version. You can use them as constraint
  files when installing Airflow from PyPI. Note that you have to specify correct Airflow version
- and python versions in the URL.
+ and Python versions in the URL.

You can create the URL to the file substituting the variables in the template below.
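The template itself falls in a part of the file not shown in this excerpt. Purely as a sketch, the URL pattern below is an assumption inferred from the tag naming described above (``constraints-2.0.1`` tags, one constraint file per major/minor Python version); the version values are hypothetical placeholders:

```shell
# Assumed URL pattern, inferred from the constraint tags mentioned above
# (e.g. constraints-2.0.1) -- not copied from the elided template.
AIRFLOW_VERSION="2.0.1"
PYTHON_VERSION="3.8"
CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt"

# Echo the resulting install command instead of running it.
echo "pip install apache-airflow==${AIRFLOW_VERSION} --constraint ${CONSTRAINT_URL}"
```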

2 changes: 1 addition & 1 deletion docs/apache-airflow/start/local.rst
@@ -86,7 +86,7 @@ the ``Admin->Configuration`` menu. The PID file for the webserver will be stored
in ``$AIRFLOW_HOME/airflow-webserver.pid`` or in ``/run/airflow/webserver.pid``
if started by systemd.

- Out of the box, Airflow uses a sqlite database, which you should outgrow
+ Out of the box, Airflow uses a SQLite database, which you should outgrow
fairly quickly since no parallelization is possible using this database
backend. It works in conjunction with the
:class:`~airflow.executors.sequential_executor.SequentialExecutor` which will
