compose and such #949

Merged
51 changes: 0 additions & 51 deletions .env.example.WIP

This file was deleted.

16 changes: 7 additions & 9 deletions .env.example.dev
@@ -3,11 +3,11 @@ ENVIRONMENT=development
API_PORT=8030
SECRET_KEY=test

-RUN_MIGRATION=yes
-RUN_COMPILEMESSAGES=yes
-RUN_LOAD_DUMMY_DATA=no
-RUN_COLLECT_STATIC=no
-RUN_DEV_SERVER=yes
+RUN_MIGRATION=True
+RUN_COMPILE_MESSAGES=True
+RUN_COLLECT_STATIC=False
+RUN_LOAD_INITIAL_DATA=False
+RUN_CREATE_SUPER_USER=True

# django
DEBUG=True
@@ -25,10 +25,8 @@ NO_REPLY_EMAIL=
DEFAULT_FROM_EMAIL=

# admin
-SUPER_ADMIN_EMAIL=a@a.co
-SUPER_ADMIN_PASS=a
-SUPER_ADMIN_FIRST_NAME=abc
-SUPER_ADMIN_LAST_NAME=dee
+DJANGO_ADMIN_EMAIL=a@a.co
+DJANGO_ADMIN_PASSWORD=a

# database
POSTGRES_USER=postgres
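Note: the renamed `RUN_*` switches now take Django-style `True`/`False` values instead of `yes`/`no`. They are presumably consumed by the API container's entrypoint; the snippet below is only a hypothetical sketch of that pattern (the flag names come from `.env.example.dev`, the script itself is not part of this PR):

```shell
#!/bin/sh
# Hypothetical entrypoint sketch: run optional setup steps based on the RUN_* flags.
set -e

if [ "$RUN_MIGRATION" = "True" ]; then
    ./manage.py migrate --no-input
fi
if [ "$RUN_COMPILE_MESSAGES" = "True" ]; then
    ./manage.py compilemessages
fi
if [ "$RUN_COLLECT_STATIC" = "True" ]; then
    ./manage.py collectstatic --no-input
fi
if [ "$RUN_LOAD_INITIAL_DATA" = "True" ]; then
    ./manage.py loaddata buildings   # fixture name borrowed from the Makefile's init-db target
fi
if [ "$RUN_CREATE_SUPER_USER" = "True" ]; then
    ./manage.py createsuperuser --no-input || true   # assumes superuser credentials are supplied via environment
fi

exec "$@"
```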
50 changes: 24 additions & 26 deletions .env.example.prod
@@ -1,52 +1,50 @@
# api deployment
ENVIRONMENT=production
API_PORT=8030
-SECRET_KEY=
+SECRET_KEY=test

-RUN_MIGRATION=yes
-RUN_COMPILEMESSAGES=yes
-RUN_LOAD_DUMMY_DATA=no
-RUN_COLLECT_STATIC=yes
-RUN_DEV_SERVER=no
+RUN_MIGRATION=True
+RUN_COMPILE_MESSAGES=True
+RUN_COLLECT_STATIC=True
+RUN_LOAD_INITIAL_DATA=False
+RUN_CREATE_SUPER_USER=True

# django
DEBUG=False
ENABLE_DEBUG_TOOLBAR=False

+ALLOWED_HOSTS=localhost
+CSRF_TRUSTED_ORIGINS=http://localhost
+CORS_ALLOWED_ORIGINS=http://localhost
+CORS_ALLOWED_ORIGIN_REGEXES=http://localhost*

SITE_URL=
EMAIL_HOST=
-EMAIL_PORT=
+EMAIL_PORT=25
EMAIL_HOST_USER=
EMAIL_HOST_PASSWORD=
-EMAIL_USE_TLS=
-EMAIL_USE_SSL=
+EMAIL_USE_TLS=False
+EMAIL_USE_SSL=False

NO_REPLY_EMAIL=
DEFAULT_FROM_EMAIL=

-ALLOWED_HOSTS=localhost
-CSRF_TRUSTED_ORIGINS=http://localhost
-CORS_ALLOWED_ORIGINS=http://localhost
-CORS_ALLOWED_ORIGIN_REGEXES=http://localhost*

# admin
-SUPER_ADMIN_EMAIL=
-SUPER_ADMIN_PASS=
-SUPER_ADMIN_FIRST_NAME=
-SUPER_ADMIN_LAST_NAME=
+DJANGO_ADMIN_EMAIL=a@a.co
+DJANGO_ADMIN_PASSWORD=a

# database
-POSTGRES_USER=
-POSTGRES_PASSWORD=
-POSTGRES_DB=
-DATABASE_URL=postgres://user:password@netloc/database
+POSTGRES_USER=postgres
+POSTGRES_PASSWORD=secret
+POSTGRES_DB=seismic_site
+DATABASE_URL=postgres://postgres:secret@db/postgres

GUNICORN_PORT=5000
-GUNICORN_WORKERS=10
-BACKGROUND_WORKERS=2
+GUNICORN_WORKERS=2

# client
-REACT_APP_DJANGO_SITE_URL=
-REACT_APP_DJANGO_PORT=
+REACT_APP_DJANGO_SITE_URL=http://localhost
+REACT_APP_DJANGO_PORT=8030
REACT_APP_DJANGO_API_ENDPOINT=api/v1

# external api keys
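The filled-in example now ties `DATABASE_URL` to the compose service host (`db`) and the Postgres credentials above it. As a minimal sketch (not part of the PR), assuming you prefer to derive the URL from the individual `POSTGRES_*` values rather than keep the two in sync by hand:

```shell
# Sketch only: build DATABASE_URL from the same values used by the db service.
# POSTGRES_HOST is a hypothetical helper; the example file hard-codes "db".
POSTGRES_USER=postgres
POSTGRES_PASSWORD=secret
POSTGRES_DB=seismic_site
POSTGRES_HOST=db

DATABASE_URL="postgres://${POSTGRES_USER}:${POSTGRES_PASSWORD}@${POSTGRES_HOST}/${POSTGRES_DB}"
echo "$DATABASE_URL"   # postgres://postgres:secret@db/seismic_site
```

The example's own URL points at the default `postgres` database rather than `seismic_site`, so the derived value above differs from it.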
1 change: 1 addition & 0 deletions .gitignore
@@ -3,6 +3,7 @@ api/public/*

# dotenv environment variables file
.env
+.env.dev
.env.test
.env.prod
.env.*
77 changes: 34 additions & 43 deletions Makefile
@@ -3,45 +3,36 @@ help: ## Display a help message detailing commands a
@grep -E '^[a-zA-Z_-]+:.*?## .*$$' $(MAKEFILE_LIST) | awk 'BEGIN {FS = ":.*?## "}; {printf "\033[36m%-30s\033[0m %s\n", $$1, $$2}'
@echo ""

-## [DEV ENV SETUP]
-install-docker-ubuntu: ## installs docker and docker-compose on Ubuntu
-sudo apt-get remove docker docker-engine docker.io containerd runc
-sudo apt-get update
-sudo apt-get -y install apt-transport-https ca-certificates curl gnupg-agent software-properties-common
-curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
-sudo apt-key fingerprint 0EBFCD88
-sudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu $(shell lsb_release -cs) stable" || { echo "$(shell lsb_release -cs) is not yet supported by docker.com."; exit 1; }
-sudo apt-get update
-sudo apt-get install -y docker-ce gettext
-sudo curl -L "https://github.com/docker/compose/releases/download/v2.2.3/docker-compose-$(shell uname -s)-$(shell uname -m)" -o /usr/local/bin/docker-compose
-sudo chmod +x /usr/local/bin/docker-compose

-install-docker-osx: ## installs homebrew (you can skip this at runtime), docker and docker-compose on OSX
-/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
-brew update
-brew cask install docker
-brew install docker-compose gettext

-build: ## builds the container
-docker-compose build --pull
-docker-compose up -d

build-dev: ## builds the container with the development flag
-docker-compose build --build-arg ENVIRONMENT=development --pull
-docker-compose up -d
+docker compose build
+docker compose up -d

+build-prod: ## builds the container with the production flag
+docker compose -f docker-compose.prod.yml build \
+--build-arg $$(cat .env.prod | grep ENVIRONMENT) \
+--build-arg $$(cat .env.prod | grep REACT_APP_CAPTCHA_API_KEY) \
+--build-arg $$(cat .env.prod | grep REACT_APP_HERE_MAPS_API_KEY) \
+--build-arg $$(cat .env.prod | grep REACT_APP_DJANGO_SITE_URL) \
+--build-arg $$(cat .env.prod | grep REACT_APP_DJANGO_PORT) \
+--build-arg $$(cat .env.prod | grep REACT_APP_DJANGO_API_ENDPOINT)
+docker compose -f docker-compose.prod.yml up -d

superuser: ## creates a superuser for the API
-docker-compose exec api ./manage.py createsuperuser
+docker compose exec api ./manage.py createsuperuser

init-db: superuser ## sets up the database and fixtures
-docker-compose exec api ./manage.py loaddata statistics
-docker-compose exec api ./manage.py loaddata proximal_utilities
-docker-compose exec api ./manage.py loaddata work_performed
-docker-compose exec api ./manage.py loaddata buildings
+docker compose exec api ./manage.py loaddata statistics
+docker compose exec api ./manage.py loaddata proximal_utilities
+docker compose exec api ./manage.py loaddata work_performed
+docker compose exec api ./manage.py loaddata buildings

+drop-db-dev: ## drops the containers and removes the database for the development environment
+docker compose down -v -t 60

+drop-db-prod: ## drops the containers and removes the database for the production environment
+docker compose -f docker-compose.prod.yml down -v -t 60

-drop-db: ## drops the database
-docker-compose down -t 60
-docker volume rm seismic-risc_pgdata
+drop-db: drop-db-dev drop-db-prod ## drops the containers and removes the database for both environments

redo-db: drop-db init-db ## drops the database, then sets up the database and fixtures

@@ -53,43 +44,43 @@ requirements-update: ## run pip compile and rebuild the requirement
docker compose run --rm --no-deps --entrypoint "bash -c" api "cd /code && pip-compile --resolver=backtracking -r -U -o requirements.txt requirements.in && pip-compile --resolver=backtracking -r -U -o requirements-dev.txt requirements-dev.in && chmod a+r requirements.txt && chmod a+r requirements-dev.txt"

migrations: ## generate migrations in a clean container
-docker-compose exec api ./manage.py makemigrations
+docker compose exec api ./manage.py makemigrations

migrate: ## apply migrations in a clean container
-docker-compose exec api ./manage.py migrate
+docker compose exec api ./manage.py migrate

makemessages: ## generate the strings marked for translation
-docker-compose exec api ./manage.py makemessages -a
+docker compose exec api ./manage.py makemessages -a

compilemessages: ## compile the translations
-docker-compose exec api ./manage.py compilemessages
+docker compose exec api ./manage.py compilemessages

messages: makemessages compilemessages

collectstatic:
-docker-compose exec api ./manage.py collectstatic --no-input
+docker compose exec api ./manage.py collectstatic --no-input

pyshell: ## start a django shell
-docker-compose exec api ./manage.py shell
+docker compose exec api ./manage.py shell

black: ## run the Black formatter on the Python code
black --line-length 120 --target-version py311 --exclude migrations ./api

## [TEST]
test: ## run all tests
-docker-compose run --rm api "pytest"
+docker compose run --rm api "pytest"

test-pdb: ## run tests and enter debugger on failed assert or error
-docker-compose run --rm api "pytest --pdb"
+docker compose run --rm api "pytest --pdb"

test-lf: ## rerun tests that failed last time
-docker-compose run --rm api "pytest --lf"
+docker compose run --rm api "pytest --lf"

## [CLEAN]
clean: clean-docker clean-py ## remove all build, test, coverage and Python artifacts

clean-docker: ## stop docker containers and remove orphaned images and volumes
-docker-compose down -t 60
+docker compose down -t 60
docker system prune -f

clean-py: ## remove Python test, coverage, file artifacts, and compiled message files
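The new `build-prod` recipe forwards selected values from `.env.prod` to the image build via `--build-arg $$(cat .env.prod | grep NAME)`. A quick sketch of what one of those flags expands to, assuming `.env.prod` contains the value from the updated example file:

```shell
# Sketch only: grep prints the whole NAME=value line, which then becomes the
# argument to --build-arg.
grep REACT_APP_DJANGO_PORT .env.prod
# -> REACT_APP_DJANGO_PORT=8030

# so, inside the recipe,
#   --build-arg $$(cat .env.prod | grep REACT_APP_DJANGO_PORT)
# expands to
#   --build-arg REACT_APP_DJANGO_PORT=8030
```

Because `grep` matches substrings, a commented-out or similarly named line in `.env.prod` would also be picked up, so the variable names need to stay unique in that file.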
24 changes: 12 additions & 12 deletions README.md
@@ -185,18 +185,18 @@ Make sure to check the [Environment variables](#environment-variables)
section for info on how to set up the keys before you run the following commands:

```shell
-cp .env.example.dev .env
+cp .env.example.dev .env.dev
# build the development container
make build-dev
```

-If you didn't set up the `RUN_LOAD_DUMMY_DATA` variable, you can add dummy data to the database with the following command:
+If you didn't set up the `RUN_LOAD_INITIAL_DATA` variable, you can add dummy data to the database with the following command:

```shell
-make init-db
+make build-dev
```

-If the `RUN_LOAD_DUMMY_DATA` was `yes`, then you should have dummy data but will have to create a superuser:
+If the `RUN_LOAD_INITIAL_DATA` was `yes`, then you should have dummy data but will have to create a superuser:

```shell
docker-compose exec api ./manage.py createsuperuser
@@ -241,7 +241,7 @@ The following variables change the way the API is deployed.
`RUN_MIGRATION`
Run the initial migrations (sets up the data models from the database).

-`RUN_LOAD_DUMMY_DATA`
+`RUN_LOAD_INITIAL_DATA`
Adds real & dummy data to the database (adds buildings, datafiles, and statistics).

`RUN_COLLECT_STATIC`
@@ -252,7 +252,7 @@ Runs the application in the development mode.

#### External services API keys

-In order to have a fully functional project, you have to get two API keys: HERE Maps API Key and hCAPTCHA API Key.
+To have a fully functional project, you have to get two API keys: HERE Maps API Key and hCAPTCHA API Key.

##### HERE Maps API Key

@@ -309,13 +309,13 @@ docker-compose exec api some_container_command
docker-compose exec client some_container_command
```

-In order to see all available commands, run:
+To see all available commands, run:

```shell
make help
```

-### Starting the project without docker
+### Starting the project without Docker

#### Windows platform

@@ -340,7 +340,7 @@ make help
provide or change. Double check database config line in .env. It has to follow this
pattern: `postgres://USER:PASSWORD@HOST:PORT/NAME`

-3. Run following in order to set the needed environment variables:
+3. Run following to set the needed environment variables:

```shell
activate_dev_env.bat
@@ -358,7 +358,7 @@ make help
python api/manage.py migrate --no-input
```

-6. Create admin user (user to login into admin pannel):
+6. Create admin user (user to login into admin panel):

```shell
python api/manage.py createsuperuser
@@ -407,7 +407,7 @@ Check functionality at http://localhost:3000.

### Development

-When creating new models in Django, in order to make sure they are generated in a clean environment, it is recommended
+When creating new models in Django, to make sure they are generated in a clean environment, it is recommended
to generate the migration files using the `make` command:

```shell
@@ -497,7 +497,7 @@ make test

## Production

-In order to get the container ready for production use, we need to first build it:
+To get the container ready for production use, we need to first build it:

```shell
docker build -t seismic-risc:latest ./api
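With the new Makefile targets, a production bring-up that matches this PR could look like the sketch below; the copy step is an assumption based on the file names read by `build-prod`, not something the diff documents:

```shell
# Sketch only: fill in real secrets and hosts before building.
cp .env.example.prod .env.prod
# edit .env.prod: set SECRET_KEY, POSTGRES_PASSWORD, ALLOWED_HOSTS and the API keys
make build-prod   # runs docker compose -f docker-compose.prod.yml build ... && up -d
```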