Import DMAP data into a PostgreSQL database
- Install ASDF, Python3, Postgres
  ```sh
  # for docker
  brew install docker docker-compose docker-buildx
  brew install asdf
  ```
- run `asdf install` to install tools via asdf
- run `poetry config virtualenvs.in-project true` to install the venv in the project folder
- run `poetry install` to install python dependencies
- run `cp .env.template .env` and fill out any missing environment variables
- run `docker-compose build` to build the docker images for local testing
- run `docker-compose up cubic_local_rds` to stand up a local postgres db
  - this image can also be used when running pytest (see the sketch after this list)
- run `docker-compose up cubic_import` to run the importer application
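If you want to exercise the test suite against that local database, a sequence like the one below should work. This is only a sketch: it assumes the `dmap_local_rds` container name and default `postgres` user referenced later in this README, and that the container uses the standard Postgres image (which provides `pg_isready`).

```sh
# Sketch: stand up the local Postgres container in the background, wait for it
# to accept connections, run the tests, then tear everything down.
docker-compose up -d cubic_local_rds
until docker exec dmap_local_rds pg_isready -U postgres; do sleep 1; done
poetry run pytest
docker-compose down
```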
- Navigate to repository directory.
- Update `.env` variables and source the file: `source .env`.
- Run `poetry run start` to run the ingestion process.
- Run `psql postgresql://postgres:postgres@127.0.0.1:5432/cubic_loader` to get into the database. Alternatively, after `docker-compose up`, you can:
  - `docker exec -it dmap_local_rds bash`
  - `psql -U postgres -d cubic_loader`
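For a quick, non-interactive sanity check that the ingestion wrote something, you can run `psql` through the container. This assumes the same container name and credentials as above; `\dt` simply lists the tables in the database.

```sh
# List the tables in cubic_loader without opening an interactive shell.
docker exec dmap_local_rds psql -U postgres -d cubic_loader -c '\dt'
```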
- Run format, type and lint checkers:
  - `poetry run black .`
  - `poetry run mypy .`
  - `poetry run pylint src tests`
- Run tests: `poetry run pytest`.
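If you prefer to run every check in one shot before pushing, a small wrapper like the following works. It is a convenience sketch, not a script that ships with this repo; `black --check` reports formatting problems without rewriting files.

```sh
# Run all checks in sequence, stopping at the first failure.
set -e
poetry run black . --check
poetry run mypy .
poetry run pylint src tests
poetry run pytest
```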
- run: `poetry run alembic revision -m "adding a new column"`
- Rename the generated file, prepending '0xx_', so that migrations sort by name
- run: `poetry run alembic upgrade head`
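Put together, a migration cycle looks roughly like this. The revision hash and `versions/` path below are hypothetical; alembic prints the real path of the generated file, and the renamed prefix should sort after the latest existing migration.

```sh
# Hypothetical example of one migration cycle; adjust the paths to this repo's
# alembic configuration.
poetry run alembic revision -m "adding a new column"
mv versions/ab12cd34ef56_adding_a_new_column.py versions/012_adding_a_new_column.py
poetry run alembic upgrade head
```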
New changes are automatically deployed to staging when they are merged into the `main` branch.
To deploy changes to prod, you must add a tag, which will automatically trigger the deployment. You cannot manually trigger this workflow. To add a tag to the current state, figure out the new tag name per the guidelines and run:
```sh
git tag v1.3.22 # or your tag name
git push origin --tags
```
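To see what the current tag is before picking the next one, something like this helps; `v1.3.23` below is only an example version, not a real release.

```sh
# Inspect the latest tag, then create and push the next one.
git fetch --tags
git describe --tags --abbrev=0   # prints the most recent tag reachable from HEAD
git tag v1.3.23                  # hypothetical next version
git push origin --tags
```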