Attempt to run unit tests in GitHub Actions
Airflow needs the db to be initialised even if DAGs aren't actually run.

This GitHub Actions workflow creates a postgres service container [1],
and then runs `airflow db init` before running pytest.

However, I can't use act to test this locally, because it doesn't
support service containers[2].

[1]. https://docs.github.com/en/free-pro-team@latest/actions/guides/creating-postgresql-service-containers
[2]. nektos/act#173
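A service container is only useful once postgres is actually accepting connections, which is what the `pg_isready` healthcheck in the workflow guards against. As an illustration only (this is not part of the commit, and the helper name is mine), a minimal TCP readiness poll looks like:

```python
import socket
import time

def wait_for_port(host, port, timeout=30.0):
    """Poll until a TCP port accepts connections, or give up after `timeout`.

    Roughly the failure mode the workflow's pg_isready healthcheck prevents:
    the job's steps running before postgres is ready.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            # create_connection raises OSError while nothing is listening yet
            with socket.create_connection((host, port), timeout=1.0):
                return True
        except OSError:
            time.sleep(0.5)
    return False
```

In the workflow itself this is unnecessary, because GitHub Actions waits for the service container's healthcheck to pass before starting the steps.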
MatMoore committed Dec 31, 2020
1 parent 31e3e9a commit dcc3912
Showing 2 changed files with 51 additions and 6 deletions.
55 changes: 50 additions & 5 deletions .github/workflows/ci.yaml
@@ -5,17 +5,62 @@ on: [push]
jobs:
  build:
    runs-on: ubuntu-latest
    container: python:3.8.7

    services:
      postgres:
        image: postgres:10.8
        env:
          POSTGRES_USER: postgres
          POSTGRES_PASSWORD: postgres
          POSTGRES_DB: postgres
        ports:
          - 5432:5432
        # needed because the postgres container does not provide a healthcheck
        options: --health-cmd pg_isready --health-interval 10s --health-timeout 5s --health-retries 5

    steps:
      - uses: actions/checkout@v2
      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v2
        with:
          python-version: 3.8.6

      - name: Install postgresql-client package
        run: |
          apt-get update
          apt-get install --yes --no-install-recommends postgresql-client

      - name: Create database
        run: |
          createdb airflow
        env:
          PGUSER: postgres
          PGPASSWORD: postgres
          PGHOST: postgres

      - name: Install dependencies
        run: |
          python -m venv env
          # activate the virtualenv so pip installs into it
          . ./env/bin/activate
          pip --version
          pip install wheel
          pip install -r requirements.txt --use-deprecated=legacy-resolver
          pip list

      - name: db init
        env:
          AIRFLOW__CORE__UNIT_TEST_MODE: True
          AIRFLOW__CORE__DAGS_FOLDER: src/dags
          AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgres://postgres:postgres@postgres/airflow
          AIRFLOW__CORE__EXECUTOR: LocalExecutor
        run: |
          ./env/bin/airflow db init

      - name: Test with pytest
        run: |
          ./env/bin/pytest
        env:
          AIRFLOW__CORE__UNIT_TEST_MODE: True
          AIRFLOW__CORE__DAGS_FOLDER: src/dags
          AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgres://postgres:postgres@postgres/airflow
          AIRFLOW__CORE__EXECUTOR: LocalExecutor

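The `AIRFLOW__CORE__SQL_ALCHEMY_CONN` value above addresses the database by the service name `postgres`, which GitHub Actions exposes as a hostname to jobs running in a container. A quick standard-library sanity check of the URI's parts (the `parse_conn` helper is mine, for illustration only):

```python
from urllib.parse import urlsplit

def parse_conn(uri):
    """Break a sql_alchemy_conn URI into its parts (illustrative helper)."""
    parts = urlsplit(uri)
    return {
        "scheme": parts.scheme,
        "user": parts.username,
        "password": parts.password,
        "host": parts.hostname,            # the service container's name
        "database": parts.path.lstrip("/"),
    }

print(parse_conn("postgres://postgres:postgres@postgres/airflow"))
```

The three occurrences of `postgres` in the netloc are the user, the password, and the host respectively; only the host has to match the service name in the workflow.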

2 changes: 1 addition & 1 deletion README.md
@@ -33,7 +33,7 @@ executor = LocalExecutor

By default `dags_folder` will be set to `$AIRFLOW_HOME/dags`. To run these dags, configure it to point to the `src/dags` directory ([either in the airflow config or via environment variables](https://airflow.apache.org/docs/apache-airflow/stable/configurations-ref.html#dags-folder)).
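The environment-variable form follows Airflow's documented `AIRFLOW__{SECTION}__{KEY}` naming convention, which the workflow in this commit relies on. As a sketch of how that name maps onto a config entry (the helper function is hypothetical, not an Airflow API):

```python
def airflow_env_override(name, value):
    """Map an AIRFLOW__SECTION__KEY variable to (section, key, value).

    A sketch of Airflow's env-var naming convention, not an Airflow API.
    """
    prefix = "AIRFLOW__"
    if not name.startswith(prefix):
        raise ValueError(f"not an Airflow override: {name}")
    # the first double underscore after the prefix separates section from key
    section, _, key = name[len(prefix):].partition("__")
    return section.lower(), key.lower(), value

print(airflow_env_override("AIRFLOW__CORE__DAGS_FOLDER", "src/dags"))
# -> ('core', 'dags_folder', 'src/dags')
```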

Run `airflow initdb`
Run `airflow db init`

### Database setup
Create a postgres database for all the data to go into. E.g. on ubuntu:
