
Conversation

Contributor

@ruben-arts ruben-arts commented Dec 24, 2025

Description

Assuming this might fix #5200

How Has This Been Tested?

I've been testing this with https://github.com/ruben-arts/ros_workspace, as that spawns multiple backends and was the only project I could reproduce the error with. I haven't been able to reliably recreate the race condition, but race conditions are tricky, so this is not a guaranteed fix.

I've extended the stress test by making it multi-env, with a PyPI sdist and solving for multiple platforms:

[workspace]
channels = [
    "https://prefix.dev/pixi-build-backends",
    "https://prefix.dev/conda-forge",
    "https://prefix.dev/robostack-humble",
]
platforms = ["osx-arm64", "linux-64", "win-64", "linux-aarch64"]
preview = ["pixi-build"]


[dependencies]
ros-humble-ros-core = ">=0.10.0,<0.11"
ros-humble-turtlesim = "*"
ros-humble-navigator = { path = "src/navigator/package.xml" }
ros-humble-navigator-py = { path = "src/navigator_py/package.xml" }
ros-humble-talker-py = { path = "src/talker-py/package.xml" }

[dev]
ros-humble-navigator = { path = "src/navigator/package.xml" }
ros-humble-navigator-py = { path = "src/navigator_py/package.xml" }
ros-humble-talker-py = { path = "src/talker-py/package.xml" }

[pypi-dependencies]
sdist = "*"

[feature.test.dependencies]
pytest = "*"

[environments]
test = ["test"]

I can't seem to break it anymore, but it was hard to break in the first place, so 🤞

AI Disclosure

  • This PR contains AI-generated content.
  • I have tested any AI-generated content in my PR.
  • I take responsibility for any AI-generated content in my PR.
  • Tools: Claude

Checklist:

  • I have performed a self-review of my own code
  • I have commented my code, particularly in hard-to-understand areas
  • I have added sufficient tests to cover my changes.

This type of issue is almost impossible to recreate in a test, as the behavior is flaky.

Contributor

@baszalmstra baszalmstra left a comment


Yes, this doesn't seem right. Instead, we should just cancel the child task as well, because most likely the parent errored or was itself cancelled.

@nichmor
Contributor

nichmor commented Dec 24, 2025

Yes, this doesn't seem right. Instead, we should just cancel the child task as well, because most likely the parent errored or was itself cancelled.

I've added a child cancellation token (derived from the parent token).
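
For context, here is a minimal sketch of the child-token pattern being discussed, assuming tokio and tokio_util::sync::CancellationToken; the names and structure are illustrative only, not pixi's actual code. Cancelling the parent token also cancels every token derived from it via child_token(), so a child task listening on its own token shuts down when the parent errors or is cancelled, while the child can still be cancelled on its own without affecting the parent.

use std::time::Duration;
use tokio_util::sync::CancellationToken;

#[tokio::main]
async fn main() {
    let parent = CancellationToken::new();
    // Derive a child token: cancelling `parent` also cancels `child`,
    // but cancelling `child` leaves `parent` untouched.
    let child = parent.child_token();

    let solver_task = tokio::spawn(async move {
        tokio::select! {
            // Stand-in for long-running work, e.g. a solve spawned by a backend.
            _ = tokio::time::sleep(Duration::from_secs(30)) => {
                println!("solve finished");
            }
            // Bail out as soon as the token is cancelled.
            _ = child.cancelled() => {
                println!("solve cancelled via child token");
            }
        }
    });

    // Simulate the parent erroring or being cancelled: the child task
    // observes the cancellation through its derived token and stops.
    parent.cancel();
    solver_task.await.unwrap();
}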

@ruben-arts ruben-arts closed this Dec 24, 2025
@ruben-arts ruben-arts reopened this Dec 24, 2025
@lucascolley lucascolley added the bug Something isn't working label Dec 27, 2025
@ruben-arts ruben-arts added the test:extra_slow Run the extra slow tests label Dec 29, 2025
@ruben-arts ruben-arts enabled auto-merge (squash) December 29, 2025 09:46
@ruben-arts ruben-arts disabled auto-merge December 29, 2025 10:14
@ruben-arts ruben-arts merged commit 29d9d09 into prefix-dev:main Dec 29, 2025
108 of 109 checks passed
@baszalmstra
Contributor

@nichmor When you have time, can you point me to how this now fixes the issue? I don't understand why this would fix it. From my perspective, the tasks were already cancelled anyway? The child cancellation tokens are also never cancelled manually, are they?


Development

Successfully merging this pull request may close these issues.

the operation was cancelled error while solving

4 participants