Please see the Python integration docs for details.
We recommend using uv. It's super fast.
- Run `uv venv env` (creates a virtual environment called "env")
  - or `python3 -m venv env`
- Run `source env/bin/activate` (activates the virtual environment)
- Run `uv sync --extra dev --extra test` (installs the package in develop mode, along with test dependencies)
  - or `pip install -e ".[dev,test]"`
- Run `pre-commit install` to set up automatic linting as a pre-commit hook
- Run `make test`
- To run a specific test, run `pytest -k test_no_api_key`
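Put together, a fresh setup with uv looks like this: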
```bash
uv python install 3.9.19
uv python pin 3.9.19
uv venv env
source env/bin/activate
uv sync --extra dev --extra test
pre-commit install
make test
```
Assuming you have a local version of PostHog running, you can run `python3 example.py` to see the library in action.
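If you want to point a quick script at that local instance yourself, something along these lines works; the API key, host, and event names below are placeholders, and the exact `capture` signature varies a little between SDK versions:

```python
# Minimal sketch of exercising the SDK against a local PostHog instance.
# The API key and host are placeholders -- substitute your local project's values.
from posthog import Posthog

posthog = Posthog(
    "phc_your_local_project_api_key",   # placeholder project API key
    host="http://localhost:8000",       # typical address of a local PostHog instance
    debug=True,                         # log requests so you can see what gets sent
)

posthog.capture(distinct_id="dev-test-user", event="sdk-smoke-test")
posthog.flush()  # send any queued events before the script exits
```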
Updates are released automatically using GitHub Actions when `version.py` is updated on `master`. After bumping `version.py` in `master` and adding an entry to `CHANGELOG.md`, the release workflow will automatically trigger and deploy the new version.

If you need to check the latest runs or manually trigger a release, you can go to our release workflow's page and dispatch it manually, using the workflow from `master`.
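The version bump itself is a one-line change; assuming `version.py` exposes a single version string (check the file for the exact variable name), it looks roughly like:

```python
# version.py -- illustrative only; the real variable name and value will differ
VERSION = "1.2.3"
```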
You can run `make prep_local`, and it'll create a new folder alongside the SDK repo called `posthog-python-local`, which you can then import into the posthog project by changing `pyproject.toml` to look like this:
```toml
dependencies = [
    ...
    "posthoganalytics", # NOTE: no version number
    ...
]

...

[tool.uv.sources]
posthoganalytics = { path = "../posthog-python-local" }
```
This'll let you build and test SDK changes fully locally, incorporating them into your local posthog app stack. It mainly takes care of the `posthog` -> `posthoganalytics` module renaming. You'll need to re-run `make prep_local` each time you make a change, and re-run `uv sync --active` in the posthog app project.
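The loop for iterating on SDK changes then looks like this:

```bash
# In the SDK repo: regenerate the renamed local copy after each change
make prep_local

# In the posthog app project: re-resolve the local path dependency
uv sync --active
```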