[META] Decouple interface generation pipeline from CMake #560
I'll be putting together a concrete design proposal for this change in the coming weeks. Early input on specific use cases that folks think warrant support via pipeline generalization is most welcome. Perhaps @EricCousineau-TRI, @IanTheEngineer, @jacobperron, @koonpeng, or @gbiggs may have some.
Thanks for the ping! One feature that is missing in the current pipeline is the ability to retroactively generate interfaces for a new language. For example, in order to generate support for rcljava (and perhaps other client libraries like rclrs), we need to compile interface packages (e.g. …). Ultimately, it would be nice if a user compiling …
Thanks for filing this! I think we should keep performance at the forefront, primarily for Tier 1 platforms I'd assume. The main concern is that having CMake do a whole bunch of subprocess dispatch may slow things down a bit, esp. on Windows (DISCLAIMER: I currently have zero stake in Windows use cases, but just wanna make sure we don't disservice peeps). Are there already existing benchmarks like that (e.g. recording metrics from ament about configuration time in CMake)? And dumb question: can we assume that any system building ROS 2 components will have access to Python? (I think Python's a prereq, but just wanna make sure!)
Python is currently listed as a dependency in REP 2000, though there are efforts to define a "minimal" C++-only installation (see ros-infrastructure/rep#231). So, I suppose Python is not (or in the future may not be) a strict dependency for a system running ROS. But, I think Python is a reasonable requirement for the purposes of generating and building ROS messages. It would basically require an entire rewrite of the interface generation pipeline if we dropped Python (for better or worse).
Yeah, I agree; requiring Python on the build machine seems fine. We just want to make sure it isn't required on the target (unless you are using …).
This is my experience when working on a …
The list of all packages can easily be obtained from the ament index, but "2" and "3" are more complicated. The only way I know of to parse the message definitions is using …. It would also be nice if a 3rd-party IDL generator could easily make reliable incremental builds; we will need a way that allows arbitrary build systems to detect changes and selectively rebuild only the messages that have changed. After further testing, I realized that this may affect more than just build systems. There are multiple places in the ROS ecosystem that assume that a message's library can be found in …. https://github.com/koonpeng/ros_msg_with_diff_pkg_name is an example message that exports its library in …
This also fails with rclpy and even rclcpp. I haven't been able to test this with ros2 bag, but I think it would likely fail as well. Given that none of the ROS tools work with this message, maybe it was never intended for such a message to be "valid"; in that case I would suggest adding a check in the CMake macros/functions to prevent such messages from being built.
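The incremental-build concern above boils down to change detection. A minimal, build-system-agnostic sketch using content hashing against a recorded manifest — all function and file names here are illustrative, not part of any real rosidl API:

```python
import hashlib
import json
from pathlib import Path


def _digest(path: Path) -> str:
    # Hash file contents, not timestamps, so results are reproducible
    # across checkouts and build machines.
    return hashlib.sha256(path.read_bytes()).hexdigest()


def changed_interfaces(interface_files, manifest_path: Path):
    """Return the files whose content differs from the recorded manifest.

    The manifest is a JSON map of path -> content hash, rewritten on
    every call so the next invocation sees the new baseline.
    """
    try:
        old = json.loads(manifest_path.read_text())
    except FileNotFoundError:
        old = {}  # first run: everything counts as changed
    current = {str(p): _digest(Path(p)) for p in interface_files}
    changed = [p for p, h in current.items() if old.get(p) != h]
    manifest_path.write_text(json.dumps(current, indent=2))
    return changed
```

A build system (CMake, Bazel, or a third-party one) could call this between the "scan" and "generate" steps to rebuild only the affected messages.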
This issue has been mentioned on ROS Discourse. There might be relevant details there: https://discourse.ros.org/t/common-messages-github-organization/18028/14
Thank you all for the feedback. I'm still trying to put all the pieces together (this one's really tough 😅). In what follows, I'll reply while I lay out my current train of thought.

As I understand it, ROS software is organized in packages. A package contains source code and data files, along with a build system (or build system generator) to turn that into usable artifacts (in the most general sense, e.g. an executable binary, a shell script, a dynamic library, a Python module, a Java JAR file, etc.). Packages are the smallest distributable unit. Each package is defined in one place and built in full.

Packages are usually built in groups using workspaces. There can only be one version or variation of a package in a workspace. However, workspaces can be overlaid such that a version or variation of a given package in the overlay takes precedence over a different version or variation of the same package in the underlay.

I intend to stick to this organization model.
@jacobperron I don't think this feature belongs in the pipeline. Interfaces in core packages that are distributed in binary underlays are already built, and I think pulling package sources into an overlay is the simplest, cleanest, and most explicit solution. Perhaps what we need here is …
@EricCousineau-TRI AFAIK no, there are no configuration-time nor build-time benchmarks.
@EricCousineau-TRI As mentioned above, Python is a prerequisite according to REP-2000, and the current pipeline relies on it significantly. That is not a justification, though, considering that we're trying to decouple this from …
@koonpeng I don't agree. A … By your description of …, I can think of two solutions that stay within the model. One is to re-build with a new generator; automating custom overlay generation can definitely help. The other is to go fully dynamic and generate the bindings on the fly, at runtime, in process memory. That's assuming you can go without actually building anything, instead loading dynamic libraries and working with opaque types and binary blobs.
@koonpeng Agreed, but this is entirely up to the build system. Generators must only provide enough information for it to be possible.
@koonpeng I think this is a related but orthogonal problem. Currently, only …. One potential solution would be to have packages use the …
@jacobperron Thanks for the useful insights!
Yeah, what I am building is not really a rosidl generator, but more of a bindings generator, if that makes sense. The Node.js ecosystem is so different that it is unrealistic to somehow tie the bindings into the ROS ecosystem. So my current solution is to crawl through the packages and generate bindings for each message at build time (of the consuming application). Would this be the recommended approach for a third-party "rcl" library that is distributed outside the ROS ecosystem?
That last bit is key. Yes, you can definitely do this, but in doing so you're effectively leaving the ROS ecosystem. Another ROS package won't be able to depend on and use that generated code (at least not through standard channels). I think your use case simply requires ROS packages to export their artifacts in a way you can consume. And if you can afford doing this in Python, you get the interface parser for free.
Sure, but perhaps we don't need to think of it as a re-build. All a generator package needs are the interface definition files (e.g. …).
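A generator outside CMake could collect a package's installed definition files by convention alone. A hedged sketch assuming the usual share/&lt;package&gt;/{msg,srv,action} install layout; the function name is illustrative, and real code would resolve the install prefix via the ament index rather than take it as an argument:

```python
from pathlib import Path

# Suffixes conventionally used for ROS 2 interface definitions.
INTERFACE_SUFFIXES = {".msg", ".srv", ".action", ".idl"}


def find_interface_files(install_prefix, package_name):
    """Collect interface definition files installed under a package's
    share directory. Returns an empty list if the package isn't there."""
    share = Path(install_prefix) / "share" / package_name
    return sorted(
        p for p in share.rglob("*") if p.suffix in INTERFACE_SUFFIXES
    )
```

With the definition files (plus those of the package's dependencies) in hand, any build system could feed them to a generator without going through ament_cmake.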
This works well for developing rosidl generators. I guess the problem I'd like to see addressed is related to releasing and distributing packages for other languages. AFAIU, we can't distribute binaries via the buildfarm for languages that are not part of the "core" ROS installation. This means many ROS packages that depend on client libraries other than …
Yeah, this is the main problem I see as well. Even if we make it possible to release packages for other languages, it wouldn't help third-party developers, who don't have access to the build farm, to have their rosidl generators included when building the messages.
This is not strictly true. Generators are provided with interface definition files plus their dependencies. Packages usually don't, but could, affect that generated code arbitrarily.
That breaks the assumption that packages are the smallest distributable unit. Packages built and/or released would no longer be static, but would implicitly depend on the state of your installation. I also wonder how we would handle generators that are already present in the underlay and for which there's generated code already. Would re-building …?
Would these dedicated packages have their own …?
To make sure I understand: is this because third-party generators are not present in our core builds, and so core interfaces are not generated for these other languages? That's a very good point I had not thought about.

What if, for each interface package as we know it today, we had an interface-only package, a package for each language, and a "metapackage" using group dependencies and …? For instance, a CMake package that depends on …

Incidentally, by making each language-specific interface package target a specific set of build system tools, we can configure them more precisely. It'd definitely help me address a subtle problem we have with interface versioning today: the same interface package version can expose completely different APIs if it was built against different build system tool versions. We can land a breaking change w/o touching the package.
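The split proposed above could look roughly like this in the metapackage's manifest. All package and group names are hypothetical; group_depend and member_of_group are the standard package.xml format 3 mechanisms for group dependencies:

```xml
<!-- Hypothetical metapackage tying the per-language split back together.
     All package and group names here are illustrative. -->
<package format="3">
  <name>example_msgs</name>
  <version>0.1.0</version>
  <description>Pulls in every language binding of example_msgs_interfaces</description>
  <maintainer email="maintainer@example.com">Example Maintainer</maintainer>
  <license>Apache-2.0</license>

  <!-- The interface-only package: just .msg/.srv/.idl files, no generated code. -->
  <exec_depend>example_msgs_interfaces</exec_depend>

  <!-- example_msgs_cpp, example_msgs_py, etc. would each declare
       <member_of_group>example_msgs_bindings</member_of_group>,
       so depending on the group pulls in whichever bindings exist. -->
  <group_depend>example_msgs_bindings</group_depend>
</package>
```

A C++ consumer could then skip the metapackage and depend on example_msgs_cpp directly, avoiding the Python (or Java, etc.) bindings entirely.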
Tagging the @ros2/team, as my last proposal will likely spark controversy.
I would actually try to do the opposite: make the tool as unaware of that organization model as possible. I.e., I would like to have a tool that allows you to do the following:

ros2 generate_idl IDL_FILE_1 ... IDL_FILE_N --depends-lib-dir LIB_DIR_1 ... LIB_DIR_M --depends-include-dir INC_DIR_1 ... INC_DIR_M --depends-idl-dir IDL_DIR_1 ... IDL_DIR_M --language-plugins c cpp python java --output whatever/directory/i/want

and then, after having something in its most basic form working, we could add sugar that is "ROS build system" aware and does what you generally want in most cases, e.g.:

ros2 generate_idl IDL_FILE_1 ... IDL_FILE_N --depends-packages PACKAGE_1 ... PACKAGE_M --language-plugins autodetect --output whatever/directory/i/want

PS: I may be missing something, and making the tool not "ROS build system" aware may be more complicated than I think.
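The command line sketched above maps naturally onto a standard argument parser. A toy sketch — the tool itself does not exist; the flag names are taken from the comment above:

```python
import argparse


def build_parser() -> argparse.ArgumentParser:
    """Argument parser mirroring the hypothetical 'ros2 generate_idl' CLI."""
    parser = argparse.ArgumentParser(prog="ros2 generate_idl")
    parser.add_argument("idl_files", nargs="+",
                        help="IDL files to generate code for")
    # Dependency lookup paths; empty unless the sugar layer fills them in.
    parser.add_argument("--depends-lib-dir", nargs="*", default=[])
    parser.add_argument("--depends-include-dir", nargs="*", default=[])
    parser.add_argument("--depends-idl-dir", nargs="*", default=[])
    parser.add_argument("--language-plugins", nargs="+", required=True,
                        help="e.g. c cpp python java, or 'autodetect'")
    parser.add_argument("--output", required=True,
                        help="output directory for generated code")
    return parser
```

The "ROS build system aware" sugar would then just be a thin layer that resolves --depends-packages into these directory flags before delegating to the same parser.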
IMO, this might be a good idea, so …
Definitely agree with trying to improve third-party generator support while refactoring, and I see the benefit of splitting the message packages into per-language packages with ….

On decoupling from CMake, I think the scope might be smaller (but still not small 🙃) than what's been discussed so far. It might be limited to replacing the …
I fully agree with that. What I meant by sticking to the current model is that I'd much rather keep …
This is outside the scope of this ticket, yes. But while I'm at this, I'm trying to land a design that, to the extent that it is possible, naturally fixes the problems we know we have today and supports the features we want in the future. I have the impression we won't be doing this again any time soon.
It's a bit more complicated than that. There will be something like a compiler (for interface generators and interface typesupport generators), but to achieve a "…", the CMake build logic itself (which is by far the largest non-reusable portion of the pipeline) has to be put elsewhere (and then find its way back into CMake, or Bazel, or X). Anyhow, we'll discuss over a design doc soon.
What do you mean by build logic? I thought most of the non-reusable stuff was the ament extension boilerplate, defining the output files from each generator, and invoking the code generators with CMake targets.
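That "defining the output files from each generator" bookkeeping could plausibly move into a reusable library instead of being repeated in each package's CMake. A hedged sketch; the filename patterns below are illustrative, not the exact ones the real rosidl generators emit:

```python
import re

# Per-generator output filename templates, keyed by language plugin.
# These patterns are made up for illustration.
GENERATOR_OUTPUT_TEMPLATES = {
    "c": ["{name}__struct.h", "{name}__functions.h", "{name}__functions.c"],
    "cpp": ["{name}__struct.hpp", "{name}__traits.hpp"],
    "py": ["_{name}.py", "_{name}_s.c"],
}


def snake_case(interface_name: str) -> str:
    """CamelCase interface name -> snake_case file stem,
    e.g. PoseStamped -> pose_stamped."""
    return re.sub(r"(?<!^)(?=[A-Z])", "_", interface_name).lower()


def expected_outputs(interface_name: str, generator: str):
    """List the files a given generator is expected to produce for one
    interface, so any build system can declare them up front."""
    stem = snake_case(interface_name)
    return [t.format(name=stem) for t in GENERATOR_OUTPUT_TEMPLATES[generator]]
```

A CMake (or Bazel) wrapper would then only need to query this table to declare targets and outputs, rather than hard-coding the patterns per package.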
👍 yes, retroactive rebuilds are a bad idea IMO. |
@sloretz Yeap! Plus configuring the build. Most packages share quite a bit of logic, but some have their own peculiarities (e.g. …).
There is just one problem I see with this: when a third-party contributor releases a new message package, or when an existing message is updated, …
Correct. There wouldn't be any special logic for these packages. That is, no automatic package creation, no automatic package releases. TBH I don't think that would scale (i.e. centralized CI/CD for every possible interface, language, and middleware combination), but I'm open to folks arguing otherwise.
Alright, ros2/design#310 is up. Get there and bash it till its rough edges are gone!
Discussion at ros2/design#310 is still ongoing, but I think we all agree about the need for a unified code generation CLI. I'll open up a ticket to track the progress of that first milestone.
Feature request
Feature description
As it stands, the rosidl interface generation pipeline is strongly coupled with CMake, and in particular with ament_cmake machinery. There's no easy way for external projects using different build systems (or build system generators) to generate their own interfaces, or provide their own generators and/or type support packages, but to delegate to a CMake project.

Pushing as much logic and data (e.g. expected output from any given script) into reusable libraries and scripts would greatly ease integration with other build systems. Most code generation already takes place in separate Python scripts that could be used as a starting point.
In addition to that, though orthogonal, making the type support dynamic-loading machinery optional may further simplify the process for users that do not need such functionality.