Attempt to run unit tests in github actions
Airflow needs the db to be initialised even if DAGs aren't actually run.

This GitHub Actions workflow creates a postgres service container[1],
and then runs initdb before running pytest.

However, I can't use act to test this locally, because it doesn't
support service containers[2].

[1]. https://docs.github.com/en/free-pro-team@latest/actions/guides/creating-postgresql-service-containers
[2]. nektos/act#173
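
Since act can't exercise the service container, the workflow's steps can be approximated locally instead. The sketch below is not part of this commit; it assumes Docker is available and simply mirrors the image, credentials, and Airflow environment variables from the workflow, with the database host swapped to localhost:

    # Sketch only: values mirror the CI workflow; adjust ports/paths as needed.
    docker run -d --name airflow-test-db \
      -e POSTGRES_USER=postgres -e POSTGRES_PASSWORD=postgres -e POSTGRES_DB=postgres \
      -p 5432:5432 postgres:10.8

    # Requires a local postgresql-client, as the workflow installs in CI.
    PGUSER=postgres PGPASSWORD=postgres PGHOST=localhost createdb airflow

    export AIRFLOW__CORE__UNIT_TEST_MODE=True
    export AIRFLOW__CORE__DAGS_FOLDER=src/dags
    export AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgres://postgres:postgres@localhost/airflow
    export AIRFLOW__CORE__EXECUTOR=LocalExecutor

    airflow db init
    pytest
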
MatMoore committed Dec 31, 2020
1 parent 31e3e9a commit 140045d
Showing 3 changed files with 54 additions and 7 deletions.
55 changes: 51 additions & 4 deletions .github/workflows/ci.yaml
@@ -5,17 +5,64 @@ on: [push]
jobs:
  build:
    runs-on: ubuntu-latest
    container: python:3.8.7

    services:
      postgres:
        image: postgres:10.8
        env:
          POSTGRES_USER: postgres
          POSTGRES_PASSWORD: postgres
          POSTGRES_DB: postgres
        ports:
          - 5432:5432
        # needed because the postgres container does not provide a healthcheck
        options: --health-cmd pg_isready --health-interval 10s --health-timeout 5s --health-retries 5

    steps:
      - uses: actions/checkout@v2
      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v2
        with:
          python-version: 3.8.6

      - name: Install postgresql-client package
        run: |
          apt-get update
          apt-get install --yes --no-install-recommends postgresql-client
      - name: Create database
        run: |
          createdb airflow
        env:
          PGUSER: postgres
          PGPASSWORD: postgres
          PGHOST: postgres

      - name: Install dependencies
        run: |
          python -m venv env
          . env/bin/activate
          pip --version
          pip install wheel
          pip install -r requirements.txt --use-deprecated=legacy-resolver
          pip list
      - name: db init
        env:
          AIRFLOW__CORE__UNIT_TEST_MODE: True
          AIRFLOW__CORE__DAGS_FOLDER: src/dags
          AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgres://postgres:postgres@postgres/airflow
          AIRFLOW__CORE__EXECUTOR: LocalExecutor
        run: |
          . env/bin/activate
          airflow db init
      - name: Test with pytest
        run: |
          . env/bin/activate
          pytest
        env:
          AIRFLOW__CORE__UNIT_TEST_MODE: True
          AIRFLOW__CORE__DAGS_FOLDER: src/dags
          AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgres://postgres:postgres@postgres/airflow
          AIRFLOW__CORE__EXECUTOR: LocalExecutor

2 changes: 1 addition & 1 deletion README.md
@@ -33,7 +33,7 @@ executor = LocalExecutor

By default `dags_folder` will be set to `$AIRFLOW_HOME/dags`. To run these dags, configure it to point to the `src/dags` directory ([either in the airflow config or via environment variables](https://airflow.apache.org/docs/apache-airflow/stable/configurations-ref.html#dags-folder)).

-Run `airflow initdb`
+Run `airflow db init`

### Database setup
Create a postgres database for all the data to go into. E.g. on ubuntu:
4 changes: 2 additions & 2 deletions requirements.txt
@@ -1,9 +1,9 @@
alembic==1.4.3
-apache-airflow>=1.10.12
+apache-airflow==1.10.13
apispec==1.3.3
appdirs==1.4.4
argcomplete==1.12.1
-attrs>=20.1.0
+attrs==20.3.0
Babel==2.8.1
CacheControl==0.12.6
cached-property==1.5.2
