Build dependencies are not downloaded by pip download
#7863
Comments
Thanks for a super detailed bug report @qwhelan!
I'm personally not sure about how `pip download` should handle build dependencies, so I'll think a bit more about this before coming back.
@pradyunsg I think downloading the build dependencies of anything that uses PEP 518 and isn't satisfied by a wheel would be a reasonable choice, if the main idea is that `pip download` should fetch everything needed to install the target offline. (Obviously this won't work for projects that don't use PEP 518, but that seems like a good incentive for those projects to opt in.)
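For context on the PEP 518 opt-in discussed here: a project declares its static build dependencies in `pyproject.toml`. A minimal illustrative example (package names and versions are made up):

```toml
[build-system]
# Everything listed here must be installed before the build can start --
# exactly the set that `pip download` currently fails to fetch for offline use.
requires = ["setuptools>=61", "wheel", "Cython>=0.29"]
build-backend = "setuptools.build_meta"
```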
@uranusjr I tried to fiddle with pip a bit, and saw this when downloading numpy from source:

```
$ python3 src/pip download numpy --no-binary numpy -d /tmp
Collecting numpy
  Downloading numpy-1.18.2.zip (5.4 MB)
     |████████████████████████████████| 5.4 MB 466 kB/s
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing wheel metadata ... done
Saved /tmp/numpy-1.18.2.zip
Successfully downloaded numpy
```

Seems to me like we install build deps and build-wheel requirements even though we should instead download them, and not install them, or am I just missing something?
@pradyunsg I tried to look at the relevant code and ultimately failed :(
@NoahGorny The issue here is that there are two kinds of dependencies: build-time dependencies and run-time dependencies. That changed with PEP 518 and PEP 517, which added the ability to specify build-time dependencies, and a very compelling reason to use them. However, `pip download` currently does no tracking of those build-time dependencies.

This tracking could be very difficult given how pip's implementation works. pip's current approach for implementing the build isolation logic is dependent on (recursive) subprocesses [see `req_tracker`, `build_env` and the stuff they interact with] with no real inter-process communication. I'm personally not able to think of a "quick to do and not-ridiculously-difficult-to-implement-and-maintain" approach for actually implementing the tracking of build dependencies for `pip download`.

FWIW, once we get a certain amount of cleanup (i.e. stop spawning a subprocess and do the installations in the build environment in-process w/ a stack, and all the supporting refactoring), it should be possible to do so relatively easily. This is however a non-trivial task and certainly not "quick to do". :)
The reactive framework was born in the time of the now-legacy charm store. The legacy charm store had no awareness of binary compatibility with various CPU architectures. In order to enable the use of Python virtual environments, and at the same time support multiple CPU architectures, the charm-tools build action currently uses `pip download` to retrieve the source code of Python dependencies. On charm install, layer-basic will subsequently use pip to build and install the dependencies on the target. Over recent years this approach has become increasingly difficult to maintain. The Python ecosystem has a plethora of build automation and package management projects, and each and every Python dependency makes its own choices about what to use. Adding to that, pip does not really support automatic discovery of build dependencies on download (ref pypa/pip#7863). Today the legacy charm store has been replaced by Charmhub, which does have awareness of CPU architectures. The Launchpad build automation service has also gained support for charm recipes, which allow developers to build their charms across a broad range of CPU architectures. To leverage these capabilities and relieve reactive charm maintainers of the duty of manually hunting down and compiling requirement files for build dependencies, this patch adds support for building charms using binary wheels. Signed-off-by: Frode Nordahl <frode.nordahl@canonical.com>
Allowing the user to specify that wheels should be built from source could be useful for detecting any missing build-environment binary dependencies (C/Rust library packages, etc.). Signed-off-by: Frode Nordahl <frode.nordahl@canonical.com>
Hello! I want to share another use case for this. The more packages migrate to isolated builds, the harder it becomes to maintain Flatpaks.
Just ran into this problem myself. Might it be possible to have some way of running `pip download` so that it also records the build dependencies it resolves? @pradyunsg, since you said the recursive calls of the build isolation logic subprocesses might be tricky to finesse, I'm thinking that this might instead just dump the resolved build dependencies the way a requirements file would. This wouldn't reproduce all the logic (just like requirements files don't reproduce logic, to my knowledge), but maybe it's sufficient for most simple sets of build dependencies? I'm not sure exactly how pip implements build isolation, though.
Hi! Is there an update on the state of the rewrites required to fix this? This issue and #1884 are severely impacting Flatpak builds for Python apps. Now that even some "core" Python packages have switched to using build dependencies, and there is no solution at the moment, this is getting worse by the day, sadly.
Sadly, no. If there were one, it'd be reported here.
Environment
Description
An issue was reported in my repo pydata/bottleneck#333 by a user utilizing `pip download` to create a local cache prior to installation, who was seeing the following error despite having an up-to-date copy of `setuptools`:

I was able to reproduce the behavior via a `Dockerfile` and identify that the root issue is that `pip download` is not fetching the PEP 517 dependencies. This error message appears to be reproducible for other packages (such as `numpy`) provided that `pip download` fetches a source package that requires a PEP 517 build step.

My findings can be found here: pydata/bottleneck#333 (comment)

Expected behavior

As far as expected behavior, I would expect `pip download` to fetch all packages needed to successfully install the target package(s), including build-only dependencies.

How to Reproduce

`Dockerfile`:

Output: