feat: persist requirements.txt as a build artifact #284

Merged · 11 commits · Aug 26, 2022
7 changes: 6 additions & 1 deletion .github/workflows/build.yaml
@@ -91,17 +91,22 @@ jobs:
# Generate the requirements.txt that contains the hash digests of the dependencies and
# generate the SBOM using the CycloneDX SBOM generator.
make requirements sbom
+# Remove the old requirements.txt file (which includes _all_ packages) and generate a
+# new one for the package and its actual and required dependencies only.
+rm requirements.txt
+make prune requirements
# Find the paths to the files that will be included in the release.
TARBALL_PATH=$(find dist -name "*.tar.gz")
WHEEL_PATH=$(find dist -name "*.whl")
+REQUIREMENTS_PATH=$(find dist -name "*-requirements.txt")
SBOM_PATH=$(find dist -name "*-sbom.json")
HTML_DOCS_PATH=$(find dist -name "*-docs-html.zip")
BUILD_EPOCH_PATH=$(find dist -name "*-build-epoch.txt")
# Make sure dist/RELEASE_NOTES.md (which contains the release notes) exists.
touch dist/RELEASE_NOTES.md
NOTES_PATH=$(find dist -name RELEASE_NOTES.md)
# Compute the sha digest for all the release files and encode them using base64.
-DIGEST=$(sha256sum $TARBALL_PATH $WHEEL_PATH $SBOM_PATH $HTML_DOCS_PATH $BUILD_EPOCH_PATH $NOTES_PATH | base64 -w0)
+DIGEST=$(sha256sum $TARBALL_PATH $WHEEL_PATH $REQUIREMENTS_PATH $SBOM_PATH $HTML_DOCS_PATH $BUILD_EPOCH_PATH $NOTES_PATH | base64 -w0)
echo "Digest of artifacts is $DIGEST."
# Set the computed sha digest as the output of this job.
echo "::set-output name=artifacts-sha256::$DIGEST"
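Downstream of this job, the encoded digest can be used to verify the downloaded release files. A minimal sketch, assuming the job output has been placed in an `ARTIFACTS_SHA256` environment variable (the variable name is illustrative):

```bash
# Decode the base64-encoded sha256 list produced above and verify the artifacts;
# sha256sum --check expects exactly the "<digest>  <path>" lines that the
# sha256sum invocation in this workflow step emitted. Run this from the
# directory that contains dist/ so the recorded paths resolve.
echo "$ARTIFACTS_SHA256" | base64 --decode | sha256sum --check
```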
44 changes: 38 additions & 6 deletions Makefile
@@ -89,7 +89,7 @@ upgrade: .venv/upgraded-on
.venv/upgraded-on: pyproject.toml
python -m pip install --upgrade pip
python -m pip install --upgrade wheel
-python -m pip install --upgrade --upgrade-strategy eager --editable .[hooks,dev,test,docs]
+python -m pip install --upgrade --upgrade-strategy eager --editable .[actions,dev,docs,hooks,test]
$(MAKE) upgrade-quiet
force-upgrade:
rm -f .venv/upgraded-on
@@ -104,14 +104,31 @@ sbom: requirements

# Generate a requirements.txt file containing version and integrity hashes for all
# packages currently installed in the virtual environment. There's no easy way to
-# do this, and we have to use yet another external package. For more discussion, see
-# https://github.com/pypa/pip/issues/4732
-# https://github.com/peterbe/hashin/issues/139
+# do this, see also: https://github.com/pypa/pip/issues/4732
+#
+# If using a private package index, make sure that it implements the JSON API:
+# https://warehouse.pypa.io/api-reference/json.html
+#
+# We also want to make sure that this package itself is added to the requirements.txt
+# file, and if possible even with proper hashes.
.PHONY: requirements
requirements: requirements.txt
requirements.txt: pyproject.toml
-echo "" > requirements.txt
-for pkg in `python -m pip list --format freeze --disable-pip-version-check`; do hashin --verbose $$pkg; done
+echo -n "" > requirements.txt
+for pkg in `python -m pip freeze --local --disable-pip-version-check --exclude-editable`; do \
+echo -n $$pkg >> requirements.txt; \
+echo "Fetching package metadata for requirement '$$pkg'"; \
+[[ $$pkg =~ (.*)==(.*) ]] && curl -s https://pypi.org/pypi/$${BASH_REMATCH[1]}/$${BASH_REMATCH[2]}/json | python -c "import json, sys; print(''.join(f''' \\\\\n --hash=sha256:{pkg['digests']['sha256']}''' for pkg in json.load(sys.stdin)['urls']));" >> requirements.txt; \
+done
+echo -e -n "package==$(PACKAGE_VERSION)" >> requirements.txt
+if [ -f dist/package-$(PACKAGE_VERSION).tar.gz ]; then \
+echo -e -n " \\\\\n `python -m pip hash --algorithm sha256 dist/package-$(PACKAGE_VERSION).tar.gz | grep '^\-\-hash'`" >> requirements.txt; \
+fi
+if [ -f dist/package-$(PACKAGE_VERSION)-py3-none-any.whl ]; then \
+echo -e -n " \\\\\n `python -m pip hash --algorithm sha256 dist/package-$(PACKAGE_VERSION)-py3-none-any.whl | grep '^\-\-hash'`" >> requirements.txt; \
+fi
+echo "" >> requirements.txt
+cp requirements.txt dist/package-$(PACKAGE_VERSION)-requirements.txt
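To make the inline Python in that loop easier to follow, here is a standalone sketch of the same PyPI JSON API lookup for a single pinned requirement (the package name and version are illustrative):

```bash
# Fetch the release metadata for requests==2.28.1 from the PyPI JSON API and print
# one --hash=sha256:... line per published file (sdist and wheels), mirroring the
# hash lines the recipe above appends to requirements.txt.
curl -s https://pypi.org/pypi/requests/2.28.1/json \
  | python -c "import json, sys; print('\n'.join('--hash=sha256:' + f['digests']['sha256'] for f in json.load(sys.stdin)['urls']))"
```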

# Run some or all checks over the package code base.
.PHONY: check check-code check-bandit check-flake8 check-lint check-mypy
@@ -157,6 +174,21 @@ docs: docs/_build/html/index.html
docs/_build/html/index.html: check test
$(MAKE) -C docs/ html

+# Prune the packages currently installed in the virtual environment down to the required
+# packages only. Pruning works in a roundabout way, where we first generate the wheels for
+# all installed packages into the build/wheelhouse/ folder. Next we wipe all packages and
+# then reinstall them from the wheels while disabling the PyPI index server. Thus we ensure
+# that the same package versions are reinstalled. Use with care!
+.PHONY: prune
+prune:
+mkdir -p build/
+python -m pip freeze --local --disable-pip-version-check --exclude-editable > build/prune-requirements.txt
+python -m pip wheel --wheel-dir build/wheelhouse/ --requirement build/prune-requirements.txt
+python -m pip wheel --wheel-dir build/wheelhouse/ .
+python -m pip uninstall --yes --requirement build/prune-requirements.txt
+python -m pip install --no-index --find-links=build/wheelhouse/ --editable .
+rm -fr build/
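As a quick sanity check after running `make prune` (a sketch, not part of the Makefile):

```bash
# Verify that the pruned environment is still consistent: pip check reports any
# broken or missing dependencies, and pip list shows which packages survived.
python -m pip check
python -m pip list --local --exclude-editable
```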

# Clean test caches and remove build artifacts.
.PHONY: dist-clean clean
dist-clean:
2 changes: 1 addition & 1 deletion README.md
@@ -67,7 +67,7 @@ A _shared package_ or library is intended to be imported by another package or a

**Application**: the [`__main__.py`](https://docs.python.org/3/library/__main__.html#main-py-in-python-packages) file ensures an entry point to run this package as a standalone application using Python’s [-m](https://docs.python.org/3/using/cmdline.html#cmdoption-m) command-line option. A wrapper script named `something` is also generated as an [entry point into this package](https://flit.pypa.io/en/latest/pyproject_toml.html#scripts-section) by `make setup` or `make upgrade`. In addition to specifying directly dependent packages and their version ranges in `pyproject.toml`, an application should _pin_ its entire environment using the [`requirements.txt`](https://pip.pypa.io/en/latest/user_guide/#requirements-files). Use the `make requirements` command to generate that file if you’re building an application.
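For example, assuming the template's default module name `package` and the generated wrapper script `something`, the application can be started either way:

```bash
# Run the package as a standalone application via its __main__.py file:
python -m package
# Or use the wrapper script that make setup / make upgrade installs:
something
```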

-The generated `requirements.txt` file with its integrity hash for every dependent package is used to generate [SBOM](https://www.cisa.gov/sbom) in [CycloneDX format](https://cyclonedx.org/). This is an important provenance material to provide transparency in the packaging process (see also [SBOM + SLSA](https://slsa.dev/blog/2022/05/slsa-sbom)).
+The generated `requirements.txt` file with its integrity hash for every dependent package is used to generate a [Software Bill of Materials (SBOM)](https://www.cisa.gov/sbom) in [CycloneDX format](https://cyclonedx.org/). This is an important provenance material to provide transparency in the packaging process (see also [SBOM + SLSA](https://slsa.dev/blog/2022/05/slsa-sbom)). That `requirements.txt` file, in addition to the SBOM, is also stored as a build artifact for every package release.
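For reference, each pinned entry in that generated `requirements.txt` follows pip's hash-checking requirements format, with one `--hash` option per published distribution file (hash values abbreviated here for brevity):

```
package==1.2.3 \
    --hash=sha256:0123456789abcdef... \
    --hash=sha256:fedcba9876543210...
```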

## How to use this repository

10 changes: 7 additions & 3 deletions pyproject.toml
@@ -34,14 +34,18 @@ something = "package.__main__:main"
[project.entry-points]

[project.optional-dependencies]
+# The 'actions' requirements match exactly the packages installed by the workflows.
+# We keep them listed here to ensure the infrastructure BOM is consistent with what's
+# installed. Make sure to keep the requirements in sync with the workflows!
+actions = [
+"commitizen ==2.32.1",
+"twine ==4.0.1",
+]
dev = [
"flit >=3.2.0,<4.0.0",
-"hashin ==0.17.0",
"mypy >=0.921,<=0.971",
"pylint >=2.9.3,<=2.14.5",
-"commitizen >=2.28.0,<3.0.0",
"cyclonedx-bom >=3.5.0,<4.0.0",
-"twine >=4.0.1,<5.0.0",
]
docs = [
"sphinx >=5.1.1,<6.0.0",
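The new `actions` extra can also be exercised locally. A minimal sketch of installing only those pinned CI requirements alongside the package, with the extra name taken from the table above:

```bash
# Install the package in editable mode together with just the 'actions' extra,
# matching the tool versions the GitHub workflows pin. (Quoting the argument
# keeps shells like zsh from globbing the square brackets.)
python -m pip install --editable '.[actions]'
```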