Update output queueing and batching logic #1510
Conversation
src/results-output/results-doc.ts (outdated)

```ts
// Note that these onAppended callbacks are batched and called after all
// current batched entries have been written.
onAppendedCallbacks.forEach((onAppended) =>
```
It wasn't immediately apparent to me what the intent was behind the previous batching behavior, so I held off on addressing it until I could clarify. Previously the batches had extra partitioning logic that ensured at most one `onAppendedCallback` existed in a batch. I could see two reasons for this:

1. It's important to call `onAppended` immediately after that specific text was written and before any further entries are written.
2. It's important to provide the exact `Location` for that text entry.

It looks like these callbacks are (exclusively?) used by `evaluate` to add selection decoration, so I'm under the impression that it's really (2) that we want here. Regardless, I don't think either case requires special batching logic; instead I can rearrange the contents of `handleNextOutputQueueBatch` to call each queued `onAppended` immediately after writing its text, rather than after writing the whole batch.
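To illustrate that rearrangement, here is a minimal sketch, assuming a simplified `EditQueueItem` shape and a hypothetical `applyEdit` write helper passed in as a parameter (neither is Calva's actual API):

```ts
// Illustrative types only; the real definitions live in results-doc.ts.
type OutputLocation = { line: number; character: number };
type OnAppendedCallback = (location: OutputLocation) => void;

type EditQueueItem = {
  text: string;
  onAppended?: OnAppendedCallback;
};

// Sketch of the idea: write each queued item, then fire its onAppended
// callback right away, so the callback sees the exact location of its own
// text and no later entries have been written yet.
async function writeBatchSketch(
  batch: EditQueueItem[],
  applyEdit: (text: string) => Promise<OutputLocation>
): Promise<void> {
  for (const item of batch) {
    const location = await applyEdit(item.text);
    item.onAppended?.(location);
  }
}
```

With this shape, the "at most one callback per batch" partitioning becomes unnecessary, since per-item ordering is enforced by the loop itself.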
Yeah, I believe the batching was added by @brdloush, so let's see what he has to say about this.
I'd like to test this and review this more thoroughly when I have more time, unless @PEZ can get to it before me.
src/results-output/results-doc.ts (outdated)

```ts
editQueue.onQueued(() => {
  setTimeout(async () => {
    while (editQueue.size() > 0) {
      try {
        await handleNextOutputQueueBatch();
      } catch (err) {
        console.error("Error writing output: ", err);
      }
    }
  }, 100);
});
```
There's some room for improvement in how the queue gets flushed, but this gets the job done.
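One possible direction for that improvement, sketched here purely as an illustration (the `flushing` guard and the function name are hypothetical, not code from this PR): run a single drain loop at a time instead of scheduling a new one on every onQueued event.

```ts
// Hypothetical guard so that overlapping onQueued notifications don't start
// several concurrent drain loops over the same queue.
let flushing = false;

async function flushEditQueueSketch(
  size: () => number,
  handleNextOutputQueueBatch: () => Promise<void>
): Promise<void> {
  if (flushing) {
    return;
  }
  flushing = true;
  try {
    // Drain everything that is currently queued, including items added
    // while earlier batches were being written.
    while (size() > 0) {
      try {
        await handleNextOutputQueueBatch();
      } catch (err) {
        console.error("Error writing output: ", err);
      }
    }
  } finally {
    flushing = false;
  }
}
```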
Thanks for doing this!
I'm just happy to have finally identified the cause of this issue; it's been an almost daily problem for me for the past year or so!
I also want to say thanks for spending time on this!
src/results-output/results-doc.ts (outdated)

```ts
  onAppended?: OnAppendedCallback;
};
type QueuedCallback = (item: EditQueueItem) => void;
export class EditQueue {
```
Would you mind refactoring this code to be more functional (similar to how it was, but with the new functionality you've applied)? I'd prefer we not use a class for this. I think classes tend to increase complexity and the likelihood of bugs and/or hard-to-maintain code.
I could help with this if you want.
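For a rough idea of the functional shape being suggested, a closure-based factory could replace the class. This is only a sketch with hypothetical names, not what the PR ended up with:

```ts
// Illustrative types; the real ones are defined in results-doc.ts.
type OutputLocation = { line: number; character: number };

type EditQueueItem = {
  text: string;
  onAppended?: (location: OutputLocation) => void;
};

type QueuedCallback = (item: EditQueueItem) => void;

// The queue state lives in a closure instead of class fields.
function createEditQueue() {
  const items: EditQueueItem[] = [];
  const queuedCallbacks: QueuedCallback[] = [];

  return {
    push(item: EditQueueItem): void {
      items.push(item);
      queuedCallbacks.forEach((callback) => callback(item));
    },
    shift(): EditQueueItem | undefined {
      return items.shift();
    },
    size(): number {
      return items.length;
    },
    onQueued(callback: QueuedCallback): void {
      queuedCallbacks.push(callback);
    },
  };
}
```

Usage would stay close to the reviewed snippet above (`editQueue.onQueued(...)`, `editQueue.size()`, and so on), just without `new` or `this`.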
Actually, maybe hold off on that. I left a comment on the issue about a repro case: #1509 (comment). I want to make sure we understand the cause better before we make changes.
I'll be away for the next week or so, but I'll revisit this when I'm back.
Force-pushed from 81947b6 to c5d6106.
I revisited this and cleaned it up for review. I kept the original batching logic for the most part. I'm also pretty sure I was able to figure out how to reproduce this issue, described here: #1641.
Thanks for doing this! ❤️
The changes look good to me. I have tested the branch and things behave as I expect.
Please update per the PR protocol, adding a changelog entry, etc., and we're good to go, imo. 🙏
@PEZ I've updated the changelog and tested the VSIX that I built locally, but I was unsure how to use the artifact from the CI build.
CHANGELOG.md (outdated)

```md
## [Unreleased]

- Fix: [Results doc gets in a bad state and does not update](https://github.com/BetterThanTomorrow/calva/issues/1509)
```
Because reasons, we can't have an empty row between the header and the list. Suggested:

```md
## [Unreleased]
- Fix: [Results doc gets in a bad state and does not update](https://github.com/BetterThanTomorrow/calva/issues/1509)
```
Thanks again. This is good to be merged now. I can take care of the changelog thing after merge.
@stefan-toubia we should update the PR template to note that testing the VSIX is a bit optional, and more necessary when the CLJS code changes. As for the …
🎉 ❤️ 🙏
Good stuff!
What has Changed?
This refactors the logic for queueing and writing to the output window. #1509 points out that certain conditions can cause the output window to get stuck in `isEditing` mode and cause the queue to grow indefinitely.

The main change in this PR is to queue all appended output in the new `EditQueue`, to try to eliminate some of the edge conditions that cause the output to break.

This is still a draft because I still have some questions around the previous batching logic and `OnAppendedCallback`.

Fixes #1189
Fixes #1509
My Calva PR Checklist
I have:
- Directed this pull request at the `dev` branch. (Or have specific reasons to target some other branch.)
- Made sure the PR does not target the `published` branch. (Sorry for the nagging.)
- Added an `[Unreleased]` entry in `CHANGELOG.md`, linking the issue(s) that the PR is addressing.
- Tested the VSIX built by the `ci/circleci: build` test.

Ping @PEZ, @bpringe