[core][cgraph] Collapse other params into max_inflight_executions and adjust execution_index counting #49565
This seems prone to issues, e.g. if the refs are not destroyed in the same order that `execute` was called.
(ray/python/ray/experimental/compiled_dag_ref.py, lines 103 to 104 at ea4e315)
The continuation of this in `SynchronousReader` also just releases all of the buffers, so anything that was executed up to that point is lost as well; for example, this script fails.
This behavior is also inconsistent with the async path.
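To make the failure mode concrete, here is a minimal toy model (not Ray's actual implementation; all names are invented for illustration) of a reader that frees every buffer up to a given execution index. If the ref for a later index is destroyed first, results for earlier, still-unread indexes are discarded:

```python
class ToyReader:
    """Toy stand-in for a FIFO channel reader: one buffered result per execution index."""

    def __init__(self):
        self.buffers = {}   # execution_index -> result
        self.next_index = 0

    def execute(self, value):
        # Pretend the DAG ran and buffered a result for this index.
        idx = self.next_index
        self.next_index += 1
        self.buffers[idx] = value
        return idx

    def release_up_to(self, idx):
        # Mimics releasing every buffer at or below `idx`.
        for i in [i for i in self.buffers if i <= idx]:
            del self.buffers[i]


reader = ToyReader()
ref0 = reader.execute("a")   # execution index 0
ref1 = reader.execute("b")   # execution index 1

# Python does not guarantee destruction order, so the ref for index 1
# may be released first. That also discards index 0's unread result:
reader.release_up_to(ref1)
print(0 in reader.buffers)   # False: ref0's result is gone before it was read
```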
Yeah, that looks problematic. cc @kevin85421, who reviewed the skip-deserialization PR.
One idea is to deserialize the prior indexes and save the results to a buffer, skipping deserialization only for the exact index:
When a ref goes out of scope, first check whether its result is in the buffer; if so, delete it, otherwise do the above.
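A rough sketch of that idea (hypothetical class and method names, not Ray's API): on release, earlier pending results are deserialized into a cache so they are not lost, and only the exact index being released skips deserialization.

```python
class BufferedReader:
    """Sketch of the buffer-then-skip release strategy (illustrative only)."""

    def __init__(self):
        self.pending = {}   # execution_index -> serialized payload
        self.cache = {}     # execution_index -> deserialized result

    def _deserialize(self, payload):
        return payload      # stand-in for real deserialization work

    def release(self, idx):
        # A ref for `idx` went out of scope.
        if idx in self.cache:
            # Result was already materialized; just drop it.
            del self.cache[idx]
            return
        # Materialize every earlier pending result so it is not lost...
        for i in sorted(self.pending):
            if i < idx:
                self.cache[i] = self._deserialize(self.pending.pop(i))
        # ...and skip deserialization only for the exact index.
        self.pending.pop(idx, None)

    def get(self, idx):
        if idx in self.cache:
            return self.cache.pop(idx)
        # Earlier indexes must be materialized in order before `idx`.
        for i in sorted(self.pending):
            if i < idx:
                self.cache[i] = self._deserialize(self.pending.pop(i))
        return self._deserialize(self.pending.pop(idx))


reader = BufferedReader()
reader.pending = {0: "a", 1: "b", 2: "c"}
reader.release(1)          # caches index 0's result, drops index 1 unread
print(reader.get(0))       # prints "a" -- still retrievable after the release
```

This captures the overhead the next comment mentions: releasing index 1 forces deserialization of index 0 even if index 0's ref is itself about to be deleted.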
Yeah, that sounds okay. There is the overhead of needing to get the prior indices, but there isn't really any way around that. One caveat: Python might not call `del` in order, so we may end up forcing deserialization for refs that are themselves being deleted. I'm not sure we can do anything about that, though.
Should I open an issue to track this? Is it a beta blocker, or should we just do this in a follow-up PR with no issue needed?
I think we should ensure correctness first, but open an issue for the optimization. We can leave the optimization out of beta for now.
Looks like this one is not addressed yet?
Opened #49781 as the fix and #49782 as the issue tracking the future optimization.