Initial implementation #1
base: main
Conversation
This approach is already used by the existing Python metadata caching to reduce the descriptor size.
We can easily determine the RECORD path from the dist-info directory, so returning the directory specifically is more straightforward.
Module-based invocation works well when the subprocess interpreter has the same Python path and modules as the main process, but this isn't always the case (e.g. under pytest). It's safer to import the module we want to call in the main process and directly invoke that script in the subprocess. One behavioral difference between invoking a module and a script is that a script resolves imports starting from the script's directory. To avoid collisions, this change moves the scripts into a new subdirectory.
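The approach described above can be sketched as follows. This is an illustrative helper, not colcon's actual implementation: it imports the target module in the parent process, then invokes the module's source file directly so the subprocess doesn't depend on the parent's module search path.

```python
import subprocess
import sys
from pathlib import Path


def run_script_in_subprocess(module, *args):
    """Hypothetical helper: run an already-imported module's file as a script.

    Invoking the file (rather than ``python -m pkg.module``) means the
    subprocess doesn't need the parent's sys.path to locate the module,
    and imports resolve starting from the script's own directory, which
    is why such scripts are moved into a dedicated subdirectory.
    """
    script = Path(module.__file__).resolve()
    return subprocess.run(
        [sys.executable, str(script), *args],
        capture_output=True, text=True, check=True)
```

Because the module object is resolved in the main process, the subprocess receives a concrete file path and cannot accidentally pick up a different copy of the module.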
Thanks for giving this a shot, @emanuelbuholzer. That error happens with the version of
Thanks @cottsay for the feedback and your work on this. I deleted my original comment because all the tests ran fine within my CI pipeline and I thought I was maybe a bit too eager. I'm creating an example workspace with some example packages to test further use cases. Hopefully I'll have some more feedback soon.
```python
# check if the package index and manifest are being installed
data_files = get_data_files_mapping(
    setup_py_data.get('data_files', []) or [])
```
I wonder if we need an alternative approach for getting the package index and manifest, or otherwise should not warn about the fallback and its future removal.

If you migrate away from the legacy `setup.py` build, you most likely don't want to keep a `setup.py` in your package, but my testing showed that this is not possible. From what I understand, keeping one is only meant to preserve compatibility with legacy builds or with versions of tools that don't support certain packaging standards (see the setuptools documentation). If that is the case, I think it'd be great to follow the documentation and keep it a simple `setup()` call.

With the current implementation, however, the package index and manifest can only be read if you have a `setup.py` file with the mapping `data_files=[('share/ament_index/resource_index/packages', ['resource/' + package_name]), ('share/' + package_name, ['package.xml'])]`. Otherwise you'll be prompted with the warnings about the fallback and its future removal. I'm unsure what the best way to solve this is and would love to hear your thoughts. From what I understand, the traditional data files support is deprecated (see the setuptools documentation).
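To make the mapping above concrete, here is a hypothetical sketch of what a `get_data_files_mapping` helper (as seen in the diff) could produce from those `data_files` entries. The flattening into `{source: destination}` pairs is an assumption for illustration; colcon's actual implementation may differ.

```python
def get_data_files_mapping(data_files):
    """Hypothetical: flatten setuptools-style data_files into
    a {source_path: destination_dir} mapping."""
    mapping = {}
    for destination, sources in data_files:
        for source in sources:
            mapping[source] = destination
    return mapping


# The exact mapping discussed above, for an example package name.
package_name = 'my_package'  # example value
data_files = [
    ('share/ament_index/resource_index/packages',
     ['resource/' + package_name]),
    ('share/' + package_name, ['package.xml']),
]
```

With this mapping, the build can check whether `package.xml` and the ament resource marker are among the installed data files.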
Partially because colcon dependencies don't carry context around what package type the dependency originated from, it's difficult for the python testing step to reliably determine the test dependencies using abstract mechanisms. For the time being, the easiest thing to do to enable those plugins for packages discovered by standards-based extensions is to mimic the setuptools metadata format for dependencies.
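Mimicking the setuptools dependency metadata could look roughly like the sketch below, which serializes abstract run and test dependency lists into the `requires.txt` layout used by setuptools egg-info (plain requirements, then extras in bracketed sections). The function name and the choice of a `[test]` extra are assumptions for illustration, not colcon's actual code.

```python
def format_requires_txt(run_depends, test_depends):
    """Hypothetical: render dependencies in the setuptools
    requires.txt format, with test deps under a [test] extra."""
    lines = list(run_depends)
    if test_depends:
        lines.append('')        # blank line before the extra section
        lines.append('[test]')  # extras section header, egg-info style
        lines.extend(test_depends)
    return '\n'.join(lines) + '\n'
```

Emitting this format lets existing Python testing plugins that already parse setuptools metadata discover test dependencies without needing new abstract mechanisms.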
Having multiple paths to calling a specific hook has immediately backfired after a previous change started using 'call_hook' instead of the individual partial method, which routed around the setuptools decorator. It was a nice idea, but isn't really needed.
This extension seems to modify the global logging config more than other extensions do, and the result is very verbose output from the flake8 tests. Suppressing pydocstyle debug messages seems to make things reasonable again.
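One way to achieve the suppression described above is to raise the level of the pydocstyle logger so its debug messages no longer flood the flake8 test output. This is an illustrative snippet; the actual change may hook into the logging config differently.

```python
import logging

# Quiet pydocstyle's debug chatter without touching the root logger.
logging.getLogger('pydocstyle').setLevel(logging.WARNING)
```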
High number == run sooner, low number == run later
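The ordering convention above (higher number runs sooner, lower number runs later) amounts to a descending sort on priority. The dict-based extension representation and the default priority of 100 here are hypothetical, chosen only to illustrate the rule.

```python
def order_extensions(extensions, default_priority=100):
    """Hypothetical: sort extensions so higher priority values run first."""
    return sorted(
        extensions,
        key=lambda ext: ext.get('priority', default_priority),
        reverse=True)
```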
Also add the unfortunate note about symlink installs and data_files.
Interested in testing/reviewing this. Is the intention that I would follow the instructions here: https://github.com/colcon/colcon-python-project/tree/devel , create a ROS 2 package, delete the setup.py, replace it with pyproject.toml, and see if it builds properly?
No description provided.