
Generate all requirements files only using pyproject.toml as input #1957

Closed
aaronsgithub opened this issue Aug 8, 2023 · 12 comments

@aaronsgithub

What's the problem this feature will solve?

Perhaps this is already possible with the existing functionality of pip-tools and I need to go to timeout.
Or if there are workarounds which would allow me to get close to the situation described below, I would be grateful if someone could share details with me.

I want to use pyproject.toml as the sole configuration file in a python project, and use pip-compile to generate all the necessary pinned requirements files with unpinned dependencies specified in pyproject.toml instead of requirements.in.

For argument's sake, let's say the requirements files we need are:

  • requirements.txt: all requirements needed to run the application
  • requirements-test.txt: contains only the dependencies needed for integration tests
  • requirements-dev.txt: a superset of both files above, with additional development dependencies.

Then pip install should be able to infer from the pyproject.toml file that it should use:

  • requirements.txt if pip install .
  • requirements-dev.txt if pip install .[dev]
  • requirements-test.txt if pip install .[test]

Here is an example pyproject.toml with hypothetical tool.pip-compile sections:

[build-system]
requires = ["setuptools>=40.8.0", "wheel"]
build-backend = "setuptools.build_meta"

[project]
requires-python = ">=3.10"
name = "myproject"
version = "0.1.0dev1"
dynamic = ["dependencies", "optional-dependencies"]

[tool.pip-compile]
dependencies = [
    "alembic",
    "fastapi",
    "pydantic[email]",
    "sqlalchemy",
]

[tool.pip-compile.optional-dependencies]
dev = [
        "black",
        "pip-tools",
]
test = [
        "pytest",
        "pytest-playwright",
]   

[tool.setuptools.dynamic]
dependencies = { file = ["./requirements/requirements.txt"] }
optional-dependencies.dev = { file = ["./requirements/requirements-dev.txt"] }
optional-dependencies.test = { file = ["./requirements/requirements-test.txt"] }

And here is how the requirements files could be generated using pip-compile:

    pip-compile -o requirements/requirements.txt; # includes base deps only
    pip-compile --deps=test -o requirements/requirements-test.txt; # includes test deps only
    pip-compile --extra=dev --extra=test -o requirements/requirements-dev.txt; # includes base + test + dev deps

So the hypothetical --deps flag here would generate a requirements/requirements-test.txt file using only the dependencies listed in tool.pip-compile.optional-dependencies.

[tool.pip-compile.optional-dependencies]
test = [...] 

Can anything equivalent be achieved with existing functionality?

Describe the solution you'd like

The following to be possible:

  • A way to have requirements/requirements-test.txt contain only the dependencies listed under tool.pip-compile.optional-dependencies.test
  • pip install knows to install via the requirements files generated by pip-compile
  • pyproject.toml is the only configuration file in the project and acts as the single source of truth.

Alternative Solutions

This can be achieved with separate configuration files, but the goal is to have pyproject.toml be a single source of truth and satisfy all needs :)
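For reference, the separate-files setup would look roughly like this (a sketch of the usual layered pip-tools layout; file names and contents are illustrative):

# requirements/requirements.in
alembic
fastapi
pydantic[email]
sqlalchemy

# requirements/requirements-test.in
-c requirements.txt
pytest
pytest-playwright

# requirements/requirements-dev.in
-r requirements.txt
-r requirements-test.txt
black
pip-tools

Each .in file is then compiled with its own invocation, e.g. pip-compile -o requirements/requirements.txt requirements/requirements.in.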

Additional context

That's all folks.

@aaronsgithub
Author

One of the issues I'm hoping to resolve with the above is, if we follow the setup described here:
https://github.com/jazzband/pip-tools#requirements-from-pyprojecttoml

Then, pip install . would install from requirements.in instead of the requirements.txt generated by pip-compile.
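
Roughly, that setup points the dynamic metadata at the unpinned input file (paraphrasing the linked docs; the same snippet comes up again later in this thread):

[project]
dynamic = ["dependencies"]

[tool.setuptools.dynamic]
dependencies = { file = ["requirements.in"] }

so pip install . resolves against requirements.in rather than the compiled requirements.txt.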

@atugushev
Member

atugushev commented Aug 8, 2023

Currently, pip-tools can't compile only extra dependencies. This would require a new option which I'd call --only-extra, for example:

# includes test deps only
pip-compile --only-extra=test -c requirements.txt -o test-requirements.txt

Note -c requirements.txt so that common sub-dependencies are in sync with requirements.txt.

Feel free to submit a PR; I'd gladly review and merge it.

@aaronsgithub
Author

Thank you for the quick reply.

I hope to work on that pull request.

What are your thoughts on specifying inputs to pip-compile in pyproject.toml?

I currently cannot find a way to specify the inputs to pip-compile inside pyproject.toml to generate a requirements.txt while also pointing pip / setuptools at the resulting requirements.txt file.

The docs I linked to before have pip installing against requirements.in instead of the resulting requirements.txt.

That is what I was hoping to avoid with the lines:

[project]
...
dynamic = ["dependencies", "optional-dependencies"]

[tool.setuptools.dynamic]
dependencies = { file = ["./requirements/requirements.txt"] }
optional-dependencies.dev = { file = ["./requirements/requirements-dev.txt"] }
optional-dependencies.test = { file = ["./requirements/requirements-test.txt"] }

But then, pip-compile has no input to generate the requirements.txt files.

Should there be some way to inline requirements.in under [tool.pip-compile]?

@atugushev
Member

atugushev commented Aug 8, 2023

As a workaround, you can statically declare dependencies in the pyproject.toml file and use pip install -e . -c requirements.txt to apply constraints. See the example below.

Example

Create pyproject.toml

[project]
name = "foo"
version = "0.1"
dependencies = ["django"]

[project.optional-dependencies]
dev = ["black"]
test = ["pytest"]

[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"

Compile requirements.txt

$ pip-compile
#
# This file is autogenerated by pip-compile with Python 3.11
# by the following command:
#
#    pip-compile
#
asgiref==3.7.2
    # via django
django==4.2.4
    # via foo (pyproject.toml)
sqlparse==0.4.4
    # via django

Compile dev-requirements.txt:

$ pip-compile --extra dev -c requirements.txt -o dev-requirements.txt
#
# This file is autogenerated by pip-compile with Python 3.11
# by the following command:
#
#    pip-compile --constraint=requirements.txt --extra=dev --output-file=dev-requirements.txt
#
asgiref==3.7.2
    # via
    #   -c requirements.txt
    #   django
black==23.7.0
    # via foo (pyproject.toml)
click==8.1.6
    # via black
django==4.2.4
    # via
    #   -c requirements.txt
    #   foo (pyproject.toml)
mypy-extensions==1.0.0
    # via black
packaging==23.1
    # via black
pathspec==0.11.2
    # via black
platformdirs==3.10.0
    # via black
sqlparse==0.4.4
    # via
    #   -c requirements.txt
    #   django

Installation:

# install unpinned requirements
$ pip install -e .

# install pinned requirements
$ pip install -e . -c requirements.txt

# install unpinned dev requirements
$ pip install -e .[dev]

# install pinned dev requirements
$ pip install -e .[dev] -c dev-requirements.txt

@aaronsgithub
Author

Thanks for the example.

My only bugbear is the extra arguments required to pip install.

Part of the motivation is to have pyproject.toml be a single source of truth whilst allowing pip install myproject to "just work" without passing any additional command-line args.

In order for that to happen, it seems there would need to be a way of decoupling the actual dependencies in pyproject.toml from the inputs to pip-compile via [tool.pip-tools].

I was suggesting that one way of implementing this could be to have:

[tool.pip-tools]
dependencies = [...]

[tool.pip-tools.optional-dependencies]
dev = [...]

as overrides to:

[project]
dependencies = [...]

[project.optional-dependencies]
dev = [...]

I'd love to hear what you and others think of this particular aspect.

Everything is configured in a single file, and pip install would just work.

@AndydeCleyre
Contributor

I'm sorry, I think I'm missing some of your goals here, but I thought I'd offer another workaround/flow based on my best understanding of the ask:

pyproject.toml:

[build-system]
requires = ["flit_core >=3.2,<4"]
build-backend = "flit_core.buildapi"

[project]
name = "myproject"
authors = [{name = "Andy", email = "andy@example.com"}]
license = {file = "LICENSE"}
classifiers = ["License :: OSI Approved :: MIT License"]
dynamic = ["version", "description"]
dependencies = ["alembic", "fastapi", "pydantic[email]", "sqlalchemy"]

[project.urls]
Home = "https://github.com/andydecleyre/myproject"

[project.optional-dependencies]
dev = ["black", "flit", "nox", "tomli"]
test = ["pytest", "pytest-playwright"]

[tool.pip-tools]
upgrade = true
header = false
annotation-style = "line"
strip-extras = true
allow-unsafe = true

noxfile.py:

"""Tasks using Python environments."""

from pathlib import Path

import nox
import tomli

nox.options.default_venv_backend = 'venv'
nox.options.reuse_existing_virtualenvs = True


@nox.session(python='3.10')
def lock(session):
    """Generate updated lock files from pyproject.toml."""

    metadata = tomli.loads(Path('pyproject.toml').read_text())

    tempfiles = {
        Path('requirements.in'): '\n'.join(metadata['project']['dependencies']),
        Path('requirements-test.in'): '\n'.join(
            metadata['project']['optional-dependencies']['test'] +
            ['-c requirements.txt']
        ),
        Path('requirements-dev.in'): '\n'.join(
            metadata['project']['optional-dependencies']['dev'] +
            ['-r requirements.txt', '-r requirements-test.txt']
        )
    }

    session.install('-U', 'pip-tools', 'pip')

    Path('requirements').mkdir(exist_ok=True)
    with session.chdir('requirements'):
        for in_file, content in tempfiles.items():
            in_file.write_text(content)
            session.run('pip-compile', '--config', '../pyproject.toml', in_file)
        for in_file in tempfiles:
            in_file.unlink()

Generate lock files:

$ pip install -e '.[dev]'
$ nox -s lock

@aaronsgithub
Author

aaronsgithub commented Aug 9, 2023

Thanks for the flit example.

I'm probably not doing a good job of explaining so I'll take a step back and try again.

  1. As part of the specification, pyproject.toml has a prescribed way of specifying project dependencies via dependencies and optional-dependencies. But we also have the option of specifying these as dynamic values so that their values may be provided by the output of another tool, in this case pip-compile.

  2. pip-compile should generate the dynamic values of those fields for other build tools to use. The issue is that there is no way to specify the input to pip-compile inside a pyproject.toml file. (For comparison, this is possible in Poetry.) You could achieve this outside pyproject.toml by doing something like:

    pip-compile -o requirements.txt requirements.in

  and then the build tools would be pointed to the output:

[project]
...
dynamic = ["dependencies", "optional-dependencies"]

[tool.setuptools.dynamic]
dependencies = { file = ["requirements.txt"] }
  3. The goal is that there shouldn't be any need for additional scripts or config files (e.g. requirements.in). Everything should be configured in pyproject.toml. There also shouldn't be any need for additional command-line flags: pip install and pip install -e should just work based off the pyproject.toml. One way I could imagine this being achieved is to essentially allow the dependencies to be specified under a [tool.pip-tools] heading:

[project]
...
dynamic = ["dependencies", "optional-dependencies"]

[tool.pip-compile] # or [tool.pip-tools]
dependencies = { file = ["requirements.in"] } 
# or
dependencies = [...] # for inlining

[tool.setuptools.dynamic]
# the docs https://github.com/jazzband/pip-tools#requirements-from-pyprojecttoml use 
# requirements.in here which is not what we want:
# dependencies = { file = ["requirements.in"] } <- we don't want this
dependencies = { file = ["requirements.txt"] } # instead we want the output of pip-compile

@AndydeCleyre
Contributor

Usually folks don't want the lockfile to be the same content as the project's declared dependencies, because it would be way too restrictive for general installation.

@aaronsgithub
Author

Definitely true for libraries, but for applications I would want the lockfile to be used for installation. But that's something I can control within the scope of a project and its README.md so I guess this issue can be closed 🙂

@aaronsgithub reopened this Aug 10, 2023
@aaronsgithub closed this as not planned Aug 10, 2023
@webknjaz
Member

@aaronsgithub it's not related to the scope of this project; it's because there's no standard for what you're asking. Specifying library deps is standardized. For apps, you need to specify the environment deps, which are essentially a collection of coordinated pinned packages installed into that environment. It just so happens to correspond to the app deps. Though such sets of deps might differ slightly: test/dev deps would often include the app ones, while the linter deps might not need all the app deps. The test deps might be different per environment. Also, the build deps are separate from the runtime ones.
Each of those could benefit from a constraints/lock file. I know that some people want a unified lock file, while others understand that different environments might have conflicting constraints and that unrelated environments may negatively impact the lockfiles of the target one.
This is all non-standardized, but I wouldn't say that the app constraints are unsupported. It's just that there's no standard for describing them within pyproject.toml (and honestly, I think that dumping so many semantically different tool and env configs into a single file has a ton of disadvantages).
I prefer simpler in+txt file pairs that have dedicated semantic meaning. Of course, they should be in a dedicated subfolder, like requirements/ so they don't pollute project roots.

That said, it might be possible to agree on having a pip-tools specific section for the purpose of locking environments but it'd have to be well-thought first.

Ideally, such things should go through a standardization process so that there's some interoperability possible across different tools.

@aaronsgithub
Author

I prefer simpler in+txt file pairs that have dedicated semantic meaning. Of course, they should be in a dedicated subfolder, like requirements/ so they don't pollute project roots.

I actually like this too, and this is my current setup.

However, I would say the motivation for seeking the ability to do everything within pyproject.toml stems from the learning curve that Python packaging and project configuration presents for beginners, and in certain cases it simplifies things if you can point collaborators to a single file.

That said, it might be possible to agree on having a pip-tools specific section for the purpose of locking environments but it'd have to be well-thought first.

Yep, definitely requires more thought than I've provided 😅.

What I would say is that it would be nice to have a way of declaratively specifying the existing CLI functionality under [tool.pip-tools], with the possibility of inlining SRC_FILES as an array.
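
Concretely, something along these lines (a rough sketch: the first keys mirror the [tool.pip-tools] config already shown in the flit example above, while src-files is hypothetical and not an existing option):

[tool.pip-tools]
annotation-style = "line"
strip-extras = true
allow-unsafe = true
# hypothetical key: inline the SRC_FILES positional arguments here
src-files = ["pyproject.toml"]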

@AndydeCleyre
Contributor

What I would say is that it would be nice to have a way of declaratively specifying the existing CLI functionality under [tool.pip-tools], with the possibility of inlining SRC_FILES as an array.

You might be able to achieve something very close to what you're after with an additional tool, taskipy.
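
For example, the pip-compile invocations discussed above could live in pyproject.toml as tasks (a rough sketch assuming taskipy's [tool.taskipy.tasks] table; paths and task names are just examples):

[tool.taskipy.tasks]
lock = "pip-compile -o requirements/requirements.txt pyproject.toml"
lock-dev = "pip-compile --extra dev -c requirements/requirements.txt -o requirements/requirements-dev.txt pyproject.toml"

You would then run these with task lock or task lock-dev.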
