
Update output queueing and batching logic #1510

Merged: 6 commits from improved-output-queuing into BetterThanTomorrow:dev on Mar 28, 2022

Conversation

@stefan-toubia (Contributor) commented Jan 27, 2022

What has Changed?

This refactors the logic for queueing and writing to the output window. #1509 points out that certain conditions can cause the output window to get stuck in isEditing mode and cause the queue to grow indefinitely.

The main change in this PR is to queue all appended output to the new EditQueue, to try to eliminate some of the edge conditions that cause the output to break.
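For illustration, here is a minimal sketch of what a queue along these lines could look like, using the type and class names that appear in the diff excerpts further down (EditQueue, EditQueueItem, QueuedCallback, onQueued, size). The push and shift methods, the text field, the OnAppendedCallback signature, and the plain-array storage are assumptions, not the PR's actual code.

type OnAppendedCallback = (location?: unknown) => void;

type EditQueueItem = {
  text: string; // assumed field: the output to append
  onAppended?: OnAppendedCallback;
};

type QueuedCallback = (item: EditQueueItem) => void;

export class EditQueue {
  private items: EditQueueItem[] = [];
  private queuedCallback: QueuedCallback | undefined;

  // Let a consumer react whenever something lands in the queue,
  // e.g. to schedule a flush (see the onQueued usage further down).
  onQueued(callback: QueuedCallback): void {
    this.queuedCallback = callback;
  }

  push(item: EditQueueItem): void {
    this.items.push(item);
    this.queuedCallback?.(item);
  }

  size(): number {
    return this.items.length;
  }

  shift(): EditQueueItem | undefined {
    return this.items.shift();
  }
}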

This is still a draft, because I have some open questions about the previous batching logic and OnAppendedCallback.

Fixes #1189
Fixes #1509

My Calva PR Checklist

I have:

  • Read How to Contribute.
  • Directed this pull request at the dev branch. (Or have specific reasons to target some other branch.)
  • Made sure I have changed the PR base branch, so that it is not published. (Sorry for the nagging.)
  • Updated the [Unreleased] entry in CHANGELOG.md, linking the issue(s) that the PR is addressing.
  • Figured if anything about the fix warrants tests on Mac/Linux/Windows/Remote/Whatever, and either tested it there if so, or mentioned it in the PR.
  • Added to or updated docs in this branch, if appropriate
  • Tests
    • Tested the particular change
    • Figured if the change might have some side effects and tested those as well.
    • Smoke tested the extension as such.
    • Tested the VSIX built from the PR (so, after you've submitted the PR). You'll find the artifacts by clicking Show all checks in the CI section of the PR page, and then Details on the ci/circleci: build test.
  • Referenced the issue I am fixing/addressing in a commit message for the pull request.
    • If I am fixing the issue, I have used GitHub's fixes/closes syntax
    • If I am fixing just part of the issue, I have just referenced it w/o any of the "fixes" keywords.
  • Created the issue I am fixing/addressing, if it was not present.

Ping @PEZ, @bpringe


// Note that these onAppended callbacks are batched and called after all
// current batched entries have been written.
onAppendedCallbacks.forEach((onAppended) =>
@stefan-toubia (Contributor, Author) commented:

It wasn't immediately apparent to me what the intent was behind the previous batching behavior, so I held off on addressing it until I could clarify. Previously, the batches had extra partitioning logic that ensured at most one onAppendedCallback existed in a batch. I could see two reasons for this:

  1. It's important to call onAppended immediately after that specific text was written and before any next entries are written
  2. It's important to provide the exact Location for that text entry

It looks like these callbacks are (exclusively?) used by evaluate to add selection decoration, so I'm under the impression that it's really (2) that we want here. Regardless, I think neither case requires special batching logic; instead, I can rearrange the contents of handleNextOutputQueueBatch to call each queued onAppended item immediately after writing it, rather than after writing the whole batch.
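For illustration, that rearrangement could look something like the following, reusing the sketch types from the PR description above; takeNextBatch and writeToResultsDoc are hypothetical stand-ins for the real batching and document-writing code, declared here only so the sketch is self-contained.

declare function takeNextBatch(queue: EditQueue): EditQueueItem[];
declare function writeToResultsDoc(text: string): Promise<unknown>;
declare const editQueue: EditQueue;

async function handleNextOutputQueueBatch(): Promise<void> {
  for (const item of takeNextBatch(editQueue)) {
    // Write this entry and get back the exact location of its text.
    const location = await writeToResultsDoc(item.text);
    // Fire the callback immediately, before any later entry is written,
    // so it sees the precise location of the text just appended.
    item.onAppended?.(location);
  }
}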

Collaborator commented:

My brain is fried right now, so I will refrain from giving any input at the moment. I think @bpringe is the one who knows most about this anyway. Also, IIRC, @brdloush has been thinking about (and also working on) this whole batching business in general.

Member commented:

Yeah, I believe the batching was added by @brdloush, so let's see what he has to say about this.

@stefan-toubia (Contributor, Author) commented:

There's been no response from @brdloush, and I'd like to squash this bug. I can rework this to maintain the existing batching logic. The important part of this PR is removing applyingEdit. WDYT @PEZ and @bpringe, do you see any blockers?

@bpringe (Member) commented Feb 25, 2022:

I'd like to test this and review this more thoroughly when I have more time, unless @PEZ can get to it before me.

Comment on lines 294 to 304
// Debounce the flush: when something is queued, wait 100 ms, then drain
// the queue one batch at a time, logging (rather than rethrowing) errors
// so the loop keeps going.
editQueue.onQueued(() => {
  setTimeout(async () => {
    while (editQueue.size() > 0) {
      try {
        await handleNextOutputQueueBatch();
      } catch (err) {
        console.error("Error writing output: ", err);
      }
    }
  }, 100);
});
@stefan-toubia (Contributor, Author) commented:

There's some room for improvement in how the queue gets flushed, but this gets the job done.

@PEZ (Collaborator) commented Jan 27, 2022:

Thanks for doing this!

@stefan-toubia (Contributor, Author) commented:

I'm just happy to have finally identified the cause of this issue; it's been an almost daily issue for me for the past year or so!

@bpringe (Member) commented Feb 25, 2022:

I also want to say thanks for spending time on this!

onAppended?: OnAppendedCallback;
};
type QueuedCallback = (item: EditQueueItem) => void;
export class EditQueue {
Member commented:

Would you mind refactoring this code to be more functional (similar to how it was, but with the new functionality you've applied)? I'd prefer we not use a class for this. I think classes tend to increase complexity and the likelihood of bugs and/or hard-to-maintain code.

I could help with this if you want.
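For illustration, one reading of "more functional" here is a factory function that closes over local state instead of a class; this sketch just mirrors the names from the class-based sketch above and is an assumption about the intent, not code from this PR.

function createEditQueue() {
  const items: EditQueueItem[] = [];
  let queuedCallback: QueuedCallback | undefined;

  // Same surface as the class version: onQueued, push, size, shift.
  return {
    onQueued(callback: QueuedCallback): void {
      queuedCallback = callback;
    },
    push(item: EditQueueItem): void {
      items.push(item);
      queuedCallback?.(item);
    },
    size(): number {
      return items.length;
    },
    shift(): EditQueueItem | undefined {
      return items.shift();
    },
  };
}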

Member commented:

Actually, maybe hold off on that. I left a comment on the issue about a repro case: #1509 (comment). I want to make sure we understand the cause better before we make changes.

@stefan-toubia (Contributor, Author) commented:

I'll be away for the next week or so, but I'll revisit this when I'm back.

@stefan-toubia (Contributor, Author) commented:

@PEZ @bpringe

I revisited this and cleaned it up for review. I kept the original batching logic for the most part, but cleaned up the append function by breaking it down into separate parts. The most important change here, which will fix the issue of the output getting into a bad state, is the addition of the try/catch that protects the outputPending flag.
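For illustration, the guard described here amounts to something like the following sketch; the outputPending flag, editQueue, and handleNextOutputQueueBatch names come from this conversation, while the surrounding function is assumed, not the PR's actual code.

let outputPending = false;

async function flushOutputQueue(): Promise<void> {
  if (outputPending) {
    return; // a flush is already in flight
  }
  outputPending = true;
  try {
    while (editQueue.size() > 0) {
      await handleNextOutputQueueBatch();
    }
  } catch (err) {
    console.error("Error writing output: ", err);
  } finally {
    // The important part: without this finally, an error thrown mid-write
    // would leave outputPending stuck at true and the queue would grow
    // indefinitely (the failure mode described in #1509).
    outputPending = false;
  }
}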

I'm also pretty sure I was able to figure out how to reproduce this issue, as described in #1641.

stefan-toubia requested a review from bpringe on March 28, 2022 00:28
stefan-toubia marked this pull request as ready for review on March 28, 2022 00:28
@PEZ (Collaborator) left a comment:

Thanks for doing this! ❤️

The changes look good to me. I have tested the branch and things behave as I expect.

Please follow the rest of the PR protocol by adding a changelog entry, etc., and we're good to go, imo. 🙏

@stefan-toubia (Contributor, Author) commented:

@PEZ I've updated the changelog and tested the VSIX that I built locally, but I was unsure how to use the artifact from ci/circleci: build. It downloaded as a zip file that extracts into a directory, and I'm not sure how to convert that into a .vsix file.

stefan-toubia requested a review from PEZ on March 28, 2022 18:15
CHANGELOG.md Outdated
Comment on lines 5 to 8
## [Unreleased]

- Fix: [Results doc gets in a bad state and does not update](https://github.com/BetterThanTomorrow/calva/issues/1509)

Collaborator commented:

Because reasons, we can't have an empty row between the header and the list

Suggested change:

Before:
## [Unreleased]

- Fix: [Results doc gets in a bad state and does not update](https://github.com/BetterThanTomorrow/calva/issues/1509)

After:
## [Unreleased]
- Fix: [Results doc gets in a bad state and does not update](https://github.com/BetterThanTomorrow/calva/issues/1509)

@PEZ (Collaborator) left a comment:

Thanks again. This is good to be merged now. I can take care of the changelog thing after merge.

@PEZ (Collaborator) commented Mar 28, 2022:

@stefan-toubia we should update the PR template to note that testing the VSIX is somewhat optional, and mostly necessary when changing the CLJS code. As for the .zip file ending: sometimes it downloads like that, and I have no clue why. But a .vsix file is really just a zip file with some special content in it, so changing the file extension is enough.

PEZ merged commit d0d24c8 into BetterThanTomorrow:dev on Mar 28, 2022
@PEZ (Collaborator) commented Mar 28, 2022:

🎉 ❤️ 🙏

@bpringe (Member) left a comment:

Good stuff!

stefan-toubia deleted the improved-output-queuing branch on March 30, 2022 01:57