This project uses Docker. The easiest way to get started is to install Docker Desktop, which provides the docker and docker-compose commands.
After installing Docker, use the dev-setup script to run the project locally:
./scripts/dev-setup
This command will create an .env file with unique keys, build Docker images and containers, run database migrations, and load fixture data.
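For reference, the generated .env typically holds per-developer secrets and settings along these lines. This is a hypothetical sketch — the actual variables are created by ./scripts/dev-setup and will differ:

```shell
# Illustrative .env only — the real file is generated for you
# by ./scripts/dev-setup with unique values.
SECRET_KEY=some-unique-generated-key
DEBUG=True
```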
It's worth copying settings/local.py.example to settings/local.py, but leaving everything commented out for now. Updating local.py can be a handy way to re-enable production-like settings (e.g. S3 uploads) that are disabled by the default local-development settings.
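For instance, re-enabling a production-like behaviour usually just means uncommenting a line or two in settings/local.py. The setting name below is illustrative — the real options live in settings/local.py.example:

```python
# settings/local.py — hypothetical sketch; see settings/local.py.example
# for the actual options available in this project.

# Everything stays commented out by default; uncomment to re-enable
# a production-like behaviour, e.g. S3 uploads:
# AWS_STORAGE_BUCKET_NAME = "my-local-dev-bucket"
```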
With Docker Desktop running in the background, bring up the services by running:
docker-compose up
As you make changes, remember to run the tests…:
docker-compose exec app python manage.py test
docker-compose exec static npm run test
…and to make and apply database migrations:
docker-compose exec app python manage.py makemigrations
docker-compose exec app python manage.py migrate
Note that the local build goes via nginx so that we can have local-dev HTTPS for greater parity with production. The local site is served over HTTPS at https://developer-portal-127-0-0-1.nip.io. http://localhost:8000 will also still respond, but be aware that any behaviour which requires HTTPS (e.g. CSP) may cause problems.
Feel free to add 127.0.0.1 developer-portal-127-0-0-1.nip.io to your /etc/hosts if you want to work totally offline.
The project has support for using Therapist to run pre-commit linting and formatting tools. You don't have to use it, but it makes life easier.
therapist is configured to run:
- black for code formatting
- flake8 for syntax checking
- isort for import-order management
- eslint for JavaScript checking
- prettier for JavaScript formatting
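Wiring tools like these into therapist generally looks something like the sketch below. The patterns and commands here are illustrative only — see .therapist.yml in the project root for the actual configuration:

```yaml
# Illustrative sketch — the project's real .therapist.yml may differ.
actions:
  black:
    run: black {files}
    include: "*.py"
  flake8:
    run: flake8 {files}
    include: "*.py"
  eslint:
    run: npx eslint {files}
    include: "*.js"
```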
At the moment, this project assumes you have all of the above installed and available on the $PATH of your host machine, along with Python 3 and pip. You can install the dependencies using virtualenv and nvm if you want, or globally. As long as they're available, you're good.
(Note: this project does not currently support using therapist inside Docker containers.)
TIP: It's wise to enable all of the above tooling in your code editor, if possible, so that things are already in order before the pre-commit hook is run.
Install therapist:
$ pip install therapist
Install the pre-commit hook that will trigger therapist automatically:
$ therapist install
Installing pre-commit hook... DONE
(Take a look at the pre-commit file added to .git/hooks/ to confirm the path looks right.)
Now, when you git-commit a change, the staged changes will be checked by one or more of black, isort, flake8, eslint and/or prettier. See .therapist.yml in the project root for the configuration.
Alternatively, if you want to run it across the whole codebase, run:
$ therapist run developerportal/
And if, for some reason, you want therapist to auto-fix all reported issues using those tools, run:
$ therapist run developerportal/ --fix
Finally, therapist can be passed a list of file paths if you want to run it only on specific files:
$ therapist run developerportal/path/to/file.js developerportal/path/to/another_file.py
Mozilla SSO via OpenID Connect is the default for admin login.
To use this locally, you will need a Mozilla SSO account, plus values for OIDC_RP_CLIENT_ID and OIDC_RP_CLIENT_SECRET in your .env or in settings/local.py. You can get local-development versions of these credentials from another MDN Developer Portal team member, someone on the Mozilla IAM team, or the project's SRE (@limed).
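Once obtained, the credentials can go straight into your .env as plain variables. The values below are placeholders only:

```shell
# .env — placeholder values; substitute the real local-development
# credentials you obtained from the team.
OIDC_RP_CLIENT_ID=your-client-id-here
OIDC_RP_CLIENT_SECRET=your-client-secret-here
```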
If you have a Mozilla SSO account, create a Django superuser with the same email address:
docker-compose exec app python manage.py createsuperuser
If you do not have a Mozilla SSO account, or you want to work offline, you can create a Django superuser and configure the local build to use conventional Django auth. See settings/local.py.example for details.
After pulling master you may need to install new dependencies…:
docker-compose build
…or run database migrations:
docker-compose exec app python manage.py migrate
…and/or update the wagtailsearch index:
docker-compose exec app python manage.py wagtail_update_index
If things get messed up, you could (as a last resort) prune ALL Docker images, containers and volumes, and start from scratch:
./setup.sh --prune
The preferred way to do things is to have a local-dev-only AWS bucket configured as per settings/local.py.example so that user media goes up to S3, but it is possible to use local storage instead.
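If you do set up a bucket, the relevant settings are the usual S3 media-storage options — a hedged sketch, assuming the project uses django-storages (settings/local.py.example is the authoritative reference for the exact names):

```python
# Hypothetical sketch of S3 media storage via django-storages;
# check settings/local.py.example for the settings this project actually uses.
DEFAULT_FILE_STORAGE = "storages.backends.s3boto3.S3Boto3Storage"
AWS_STORAGE_BUCKET_NAME = "my-local-dev-bucket"  # your personal dev-only bucket
AWS_S3_REGION_NAME = "us-east-1"                 # adjust as appropriate
```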