A Python app that takes care of your accounting.
I Do Accountancy uses Python 3.13.
Optionally, Docker can be used for deployments or for consistent local development.
Tip: The recommended IDE is VS Code. A .vscode directory is provided containing recommended extensions, default launch configurations, and workspace-specific settings.
The built-in venv module is used to manage virtual environments, pip-tools to manage dependencies, and setuptools to build the project. A compose file and a Dockerfile are also included.
python -m venv .venv/
source .venv/bin/activate
.\.venv\Scripts\activate
python -m pip install --upgrade pip pip-tools
pip-sync is used to ensure only the required dependencies are installed. Without it, if any packages were installed before, they would remain in the environment.
pip install is used to ensure scripts are installed and available.
For development, the --editable flag is used in order to reflect changes made to the source code immediately.
pip-sync requirements/dev.txt && \
pip install --editable .[dev]
pip-sync requirements/main.txt && \
pip install .
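The install commands above imply a pyproject.toml roughly along these lines; this is a hypothetical sketch, and the actual dependency list and entry point may differ:

```toml
# Hypothetical pyproject.toml sketch; dependencies and entry points are assumptions.
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"

[project]
name = "ida"
version = "0.1.0"
dependencies = ["django"]

[project.optional-dependencies]
dev = ["ruff", "pylint", "pyright", "coverage"]

[project.scripts]
# Would provide the `manage` command used throughout this README.
manage = "ida.manage:main"
```

The `[project.scripts]` table is what makes `pip install` (and `pip install --editable`) expose the `manage` command on your PATH.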
Formatting should be done as the last step, as some auto-fixed lint issues can leave the code incorrectly formatted.
ruff format src/
ruff check --fix src/
pylint src/
Note: The commands are not chained with && \ because pylint should run regardless of ruff's exit code.
pyright src/
manage test src/apps
coverage run -m manage test src/apps
coverage report -m
ruff check --fix src/
pylint src/
ruff format src/
pyright src/
docker compose up -d --build --force-recreate
docker compose logs -f
docker compose exec ida /bin/sh
docker compose down --rmi all -v
docker run --rm -it -v VOLNAME:/data alpine /bin/sh
manage makemigrations
manage migrate
APPNAME="myapp" bash -c 'manage migrate $APPNAME 000N'
This can be useful when you have created many migrations during development, some of which undo and redo previous ones. When development is done, you can revert to the last migration on main, remove the new migrations on your branch, and run makemigrations again.
manage collectstatic
manage createsuperuser
The following commands should be run from the apps directory to avoid Django attempting to create translations for non-Django apps.
cd src/apps
Initial command (include all locales you want to generate)
manage makemessages --locale de --locale fr --locale nl
Later, you probably just want to use --all and remove obsolete entries.
manage makemessages --all --no-obsolete
The following command should be run on fresh installs, as the MO files are not included in version control.
manage compilemessages
APPNAME="myapp" bash -c 'mkdir -p src/apps/$APPNAME && manage startapp $APPNAME src/apps/$APPNAME'
$env:APPNAME="myapp"; $ErrorActionPreference="Stop"; mkdir src\apps\$env:APPNAME; manage startapp $env:APPNAME src\apps\$env:APPNAME
This can be useful to generate test data.
APPNAME="myapp" bash -c 'mkdir -p src/apps/$APPNAME/fixtures && manage dumpdata $APPNAME > src/apps/$APPNAME/fixtures/$APPNAME.json'
Note: Using >> instead of > would append to the file and could result in duplicate pks; when loading such a fixture, records that appear later overwrite earlier ones with the same pk.
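If a fixture did end up with duplicate pks, a short script can deduplicate it. This is a sketch (the function name and sample records are hypothetical); it keeps the last record per (model, pk) pair, mirroring how loaddata lets later records win:

```python
import json

def dedupe_fixture(records):
    """Keep only the last record for each (model, pk) pair (later entries win)."""
    latest = {}
    for rec in records:
        latest[(rec["model"], rec["pk"])] = rec  # later duplicates overwrite earlier
    return list(latest.values())

# Example with a duplicated pk: the newer record wins.
records = [
    {"model": "myapp.invoice", "pk": 1, "fields": {"name": "old"}},
    {"model": "myapp.invoice", "pk": 1, "fields": {"name": "new"}},
    {"model": "myapp.invoice", "pk": 2, "fields": {"name": "other"}},
]
print(json.dumps(dedupe_fixture(records), indent=2))
```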
openssl rand -hex 40
Add-Type -AssemblyName System.Web
[System.Web.Security.Membership]::GeneratePassword(81,0)
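A cross-platform alternative for generating such a secret is Python's secrets module; this is just a sketch of an equivalent, not what IDA itself prescribes:

```python
import secrets

# 40 random bytes, hex-encoded: 80 characters, like `openssl rand -hex 40`.
key = secrets.token_hex(40)
print(key)
print(len(key))  # 80
```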
manage runserver 0.0.0.0:38080
manage setwebhook
Core dependencies should be added under the dependencies table in pyproject.toml and development dependencies under the dev extra.
Once this is done, pip-compile is used to generate the pinned requirements files.
pip-compile --extra dev -o requirements/dev.txt pyproject.toml
pip-compile -o requirements/main.txt pyproject.toml
docker compose -f compose.yml -f compose.dev.yml up -d --build
docker compose up -d --build
A compose file and a Dockerfile are provided for consistent deployment. Both can be pulled into a central compose file using the include directive. This is useful when deploying multiple services on a single server behind an nginx proxy: the central file defines the nginx config and simply includes this service.
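A hypothetical central compose file on such a server might look like this; the service names, paths, and port mappings below are illustrative assumptions:

```yaml
# Central compose.yml on the server (illustrative only).
include:
  - path: ./i-do-accountancy/compose.yml  # pulls in this service as-is

services:
  nginx:
    image: nginx:alpine
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ./nginx.conf:/etc/nginx/nginx.conf:ro
```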
Q: After forking the repository, the workflows fail with the error ida.environ.MissingEnvironmentVariableError: Environment variable DJANGO_SECRET_KEY is not set.
A: IDA requires a secret to be set on the repository, and secrets are not passed along to forks. Refer to this URL to learn how to create secrets for your repository.