fix: Scope fontify buffers per session and skip redundant display updates (#113)
Merged
Conversation
Fontify buffers (" *pi-fontify:<lang>*") were globally shared by
language. When two sessions wrote files in the same language, they
corrupted each other's buffer — producing garbled syntax highlighting
and triggering full O(n) re-fontification on every token instead of
incremental O(1) appends. This caused multi-second freezes during
write tool streaming in multi-session workflows.
Changes:
- Each chat buffer now tracks its own fontify buffers via a buffer-local
hash table (pi-coding-agent--fontify-buffers). Lookup is by hash
table, not by buffer name. Buffers are killed on session end.
- Skip the delete+reinsert cycle in display-tool-streaming-text when
the visible tail content is unchanged. Most LLM tokens extend a
partial line not shown in the rolling preview, making ~90% of deltas
no-ops for the buffer mutation path. fontify-sync still runs on every
delta to keep the fontify buffer current.
- Remove handle-message-update, which accumulated text deltas via
O(n²) concat into a field never read during streaming.
- Surface fontify-sync errors via message instead of swallowing
silently with (condition-case nil ... (error nil)).
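The per-session scoping can be sketched as follows. `pi-coding-agent--fontify-buffers` is the variable named in this PR; the helper function names, the `generate-new-buffer` naming scheme, and the `pi-coding-agent--fontify-sync` call are illustrative assumptions, not the actual implementation:

```elisp
(defvar-local pi-coding-agent--fontify-buffers nil
  "Buffer-local table mapping language name to this session's fontify buffer.")

(defun pi-coding-agent--get-fontify-buffer (lang)
  "Return a fontify buffer for LANG scoped to the current chat buffer.
Creates the table and buffer on first use; never shared across sessions."
  (unless pi-coding-agent--fontify-buffers
    (setq pi-coding-agent--fontify-buffers (make-hash-table :test #'equal)))
  (let ((buf (gethash lang pi-coding-agent--fontify-buffers)))
    (unless (buffer-live-p buf)
      ;; `generate-new-buffer' uniquifies the name, so two sessions
      ;; writing the same language can never collide on a shared buffer.
      (setq buf (generate-new-buffer (format " *pi-fontify:%s*" lang)))
      (puthash lang buf pi-coding-agent--fontify-buffers))
    buf))

(defun pi-coding-agent--kill-fontify-buffers ()
  "Kill this session's fontify buffers; intended to run on session end."
  (when pi-coding-agent--fontify-buffers
    (maphash (lambda (_lang buf)
               (when (buffer-live-p buf) (kill-buffer buf)))
             pi-coding-agent--fontify-buffers)
    (clrhash pi-coding-agent--fontify-buffers)))

(defun pi-coding-agent--fontify-sync-safely (lang delta)
  "Sync DELTA into LANG's fontify buffer, surfacing errors via `message'.
Hypothetical wrapper illustrating the error-surfacing change: errors are
reported instead of swallowed by (condition-case nil ... (error nil))."
  (condition-case err
      (pi-coding-agent--fontify-sync            ; assumed internal sync fn
       (pi-coding-agent--get-fontify-buffer lang) delta)
    (error (message "pi-fontify error: %s" (error-message-string err)))))
```

Keying by the buffer object held in a buffer-local hash table, rather than by a language-derived buffer name, is what makes the isolation hold even when two sessions stream the same language concurrently.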
Benchmarks (Emacs 30.1, 500-delta write tool, xvfb real windows):
Two-session victim (mean): 36,145µs → 1,537µs (23.5x faster)
Two-session victim (p95): 84,656µs → 2,753µs (30.7x faster)
Single-session write (p50): 641µs → 594µs (stable)
Bash streaming (mean): 79µs → 28µs (2.8x faster)
Fontify buffer corruption: 246 shrink events → 0
Fontify buffer leaks: confirmed → fixed
Realistic partial-token benchmark (1,623 tokens, 171 lines of Python):
Display updates: 171 out of 1,623 deltas (89.5% skip rate)
Mean per-delta: 1,063µs (includes 89.5% near-free no-ops)
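The tail-unchanged skip can be sketched like this; the real function is display-tool-streaming-text, but this helper's name and signature are illustrative:

```elisp
(defun pi-coding-agent--update-preview (beg end new-tail)
  "Replace the rolling preview between BEG and END with NEW-TAIL.
Skips the delete+reinsert cycle when the visible text already equals
NEW-TAIL, the common case: most tokens extend a partial line below the
preview window, so ~90% of deltas leave the visible tail unchanged."
  (let ((current (buffer-substring-no-properties beg end)))
    (unless (string= current new-tail)
      (save-excursion
        (goto-char beg)
        (delete-region beg end)
        (insert new-tail)))))
```

The string comparison is cheap relative to a buffer mutation, which also dirties redisplay and re-runs modification hooks; fontify-sync still runs on every delta, so the fontify buffer never falls behind the skipped display updates.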
dnouri added a commit that referenced this pull request on Feb 25, 2026
dnouri added a commit that referenced this pull request on Feb 26, 2026