
Conversation

@Kaedriz
Contributor

@Kaedriz Kaedriz commented Nov 19, 2025

Description

Prototype

Stream latency reducer, with options to customize the minimum & maximum playback speed-up rates, and the threshold above which the speed-up activates.
The minimum rate is used near the threshold and gradually increases to the maximum rate as latency approaches extreme values (threshold * 3, for now).
When reduction is active, the playback rate is shown beside the latency (e.g. 7.23s (x1.05)).
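The ramp described above can be sketched as a clamped linear interpolation (an illustrative sketch only; `computeRate` and its default values are stand-ins for the module's settings, not the actual code):

```typescript
// Map current latency (seconds) to a playback rate.
// Below the threshold -> 1 (normal speed); at the threshold -> minRate;
// the rate then ramps linearly up to maxRate at threshold * 3 and is capped there.
function computeRate(
  latency: number,
  threshold = 5,
  minRate = 1.03,
  maxRate = 1.1,
): number {
  if (latency <= threshold) return 1;
  const maxSpeedLatency = threshold * 3;
  if (latency >= maxSpeedLatency) return maxRate;
  const t = (latency - threshold) / (maxSpeedLatency - threshold);
  return minRate + (maxRate - minRate) * t;
}
```

With these defaults, a 10s latency sits halfway up the ramp and yields roughly x1.065.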

For now it only works on Kick, and the settings are still static (I need to figure out how to implement them).

I had thought of making it a separate component, but in the end integrated it with the existing latency module. It is still lacking separate if-checks, though.

Testing

Select all the environments you tested this PR with:

Twitch

  • BetterTTV (BTTV)
  • FrankerFaceZ (FFZ)
  • 7TV
  • Native Twitch

Kick

  • 7TV
  • Nipahtv (NTV)
  • Native Kick

Related Issues

Related to #104

@Kaedriz
Contributor Author

Kaedriz commented Nov 19, 2025

@igorovh is this implementation okay? (aside from being unfinished)
I don't think this needs a separate module; I just don't know if it can be done like this.

I have also changed the latency calculation a bit, to better accommodate buffer spikes, which often caused jumps of about 1 second; now the value shifts gradually. The obvious con is that a large sudden latency change shows up with some delay, which isn't that bad.
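The smoothing described here can be sketched as a fixed-window moving average (a sketch; `LatencySmoother` and its window size are illustrative names, not the PR's actual identifiers):

```typescript
// Keep the last `windowSize` latency samples and report their mean, so a
// single buffer spike shifts the displayed latency gradually instead of
// making it jump by a full second at once.
class LatencySmoother {
  private samples: number[] = [];
  constructor(private windowSize = 10) {}

  add(sample: number): number {
    this.samples = [...this.samples, sample].slice(-this.windowSize);
    return this.samples.reduce((sum, v) => sum + v, 0) / this.samples.length;
  }
}
```

Nine steady 2s samples followed by a 12s spike average out to 3s, rather than jumping straight to 12s.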

But I need a bit of help with working out settings.

Contributor

Copilot AI left a comment


Pull Request Overview

This PR introduces a prototype stream latency reducer feature exclusively for the Kick platform. The feature dynamically adjusts video playback speed to minimize stream delay, with the playback rate displayed alongside the latency indicator when active.

Key Changes:

  • Implements adaptive playback speed adjustment that gradually increases from minimum to maximum rate based on latency thresholds
  • Adds four new settings for controlling the latency reducer: enable/disable toggle, min/max playback rates, and latency threshold
  • Enhances latency calculation with a 10-sample moving average for smoother measurements

Reviewed Changes

Copilot reviewed 6 out of 6 changed files in this pull request and generated 8 comments.

Show a summary per file
File Description
src/types/platforms/kick/kick.settings.types.ts Adds type definitions for four new latency reducer settings
src/shared/components/latency/latency.component.tsx Updates component to accept and display playback rate alongside latency, adds width: max-content for proper sizing
src/platforms/twitch/modules/stream-latency/stream-latency.module.tsx Adds placeholder playbackRate signal to maintain compatibility with updated component interface
src/platforms/kick/modules/stream-latency/stream-latency.module.tsx Implements core latency reduction logic with playback rate adjustment, latency averaging, and increased update frequency (500ms)
src/platforms/kick/modules/settings/settings.module.tsx Defines UI settings for the four new latency reducer configuration options
src/platforms/kick/kick.constants.ts Sets default values for latency reducer settings (disabled by default, 1.03-1.1x rate range, 5s threshold)



if (this.updateInterval) clearInterval(this.updateInterval);
- this.updateInterval = setInterval(() => this.updateLatency(), 1000);
+ this.updateInterval = setInterval(() => this.updateLatency(), 500);

Copilot AI Nov 19, 2025


[nitpick] The update interval has been reduced from 1000ms to 500ms, which doubles the update frequency. While this provides more responsive latency reduction, consider whether this increased polling frequency is necessary. If the goal is smoother latency sampling, the current 10-sample average already provides good smoothing. The increased frequency may impact performance without significant benefit.

Suggested change
- this.updateInterval = setInterval(() => this.updateLatency(), 500);
+ this.updateInterval = setInterval(() => this.updateLatency(), 1000);

Comment on lines 68 to 81
if (this.latencyCounter.value > this.threshold.value) {
const min = 1.03;
const max = 1.1;
const maxSpeedLatency = this.threshold.value * 3;

video.playbackRate =
this.latencyCounter.value > maxSpeedLatency
? max
: min +
((max - min) * (this.latencyCounter.value - this.threshold.value)) /
(maxSpeedLatency - this.threshold.value);
} else {
video.playbackRate = 1;
}

Copilot AI Nov 19, 2025


The latency reducer feature uses hard-coded values instead of the settings defined in streamLatencyReducerMinRate, streamLatencyReducerMaxRate, and streamLatencyReducerThreshold. The hard-coded values min = 1.03, max = 1.1, and this.threshold.value = 5 should be replaced with values loaded from settings.

Additionally, there's no check for streamLatencyReducerEnabled to control whether the feature is active. The playback rate adjustment should only occur when this setting is enabled.


this.latencyTimings.value.push(bufferEnd - currentTime);

if (!this.updateInterval) return 0;

Copilot AI Nov 19, 2025


The logic check if (!this.updateInterval) return 0; is incorrect. This check returns 0 when updateInterval is falsy, but updateInterval should always be set since it's assigned in the run() method before any calls to computeLatency(). If the intention is to check whether the interval has been initialized, this would be better placed earlier in the flow. Additionally, returning 0 instead of the actual latency value could mislead the caller into thinking there's no latency when the interval might not be set.

Suggested change
- if (!this.updateInterval) return 0;

Comment on lines 89 to 93
this.latencyTimings.value.push(bufferEnd - currentTime);

if (!this.updateInterval) return 0;
const numberOfSamples = 10;
if (this.latencyTimings.value.length > numberOfSamples) this.latencyTimings.value.shift();

Copilot AI Nov 19, 2025


The array mutation using push() on the signal's value directly (line 89) and shift() (line 93) is inefficient and may not properly trigger signal updates. Instead, create a new array to ensure proper reactivity:

this.latencyTimings.value = [...this.latencyTimings.value, bufferEnd - currentTime];
if (this.latencyTimings.value.length > numberOfSamples) {
  this.latencyTimings.value = this.latencyTimings.value.slice(1);
}
Suggested change
- this.latencyTimings.value.push(bufferEnd - currentTime);
- if (!this.updateInterval) return 0;
- const numberOfSamples = 10;
- if (this.latencyTimings.value.length > numberOfSamples) this.latencyTimings.value.shift();
+ const numberOfSamples = 10;
+ const newTimings = [...this.latencyTimings.value, bufferEnd - currentTime];
+ this.latencyTimings.value =
+   newTimings.length > numberOfSamples ? newTimings.slice(1) : newTimings;
+ if (!this.updateInterval) return 0;

Comment on lines 86 to 109
{
id: "streamLatencyReducerMinRate",
title: "Minimum Playback Rate",
description: "The minimum playback rate the stream will be speeded up to.",
type: "number",
tabIndex: tabIndexes.General,
requiresRefreshToDisable: true,
},
{
id: "streamLatencyReducerMaxRate",
title: "Maximum Playback Rate",
description: "The maximum playback rate the stream will be speeded up to.",
type: "number",
tabIndex: tabIndexes.General,
requiresRefreshToDisable: true,
},
{
id: "streamLatencyReducerThreshold",
title: "Latency Threshold",
description: "The latency threshold (in seconds) at which the playback rate will be speeded up.",
type: "number",
tabIndex: tabIndexes.General,
requiresRefreshToDisable: true,
},

Copilot AI Nov 19, 2025


The number input settings for streamLatencyReducerMinRate, streamLatencyReducerMaxRate, and streamLatencyReducerThreshold should have validation constraints. Consider adding min, max, and step properties similar to the chatImagesSize setting (lines 154-158). For example:

  • streamLatencyReducerMinRate: min=1.0, max=2.0, step=0.01
  • streamLatencyReducerMaxRate: min=1.0, max=2.0, step=0.01
  • streamLatencyReducerThreshold: min=1, max=30, step=1

Without these constraints, users could enter invalid values that could break video playback.
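Following that suggestion, the threshold entry might look like the sketch below (assumption: the `min`/`max`/`step` property names match what the review says `chatImagesSize` uses; the clamp helper is purely illustrative and not part of the PR):

```typescript
// Hypothetical settings entry with the validation constraints the review asks for;
// the min/max/step property names are assumed, mirroring chatImagesSize.
const streamLatencyReducerThresholdSetting = {
  id: "streamLatencyReducerThreshold",
  title: "Latency Threshold",
  description: "The latency threshold (in seconds) at which playback is sped up.",
  type: "number",
  min: 1,
  max: 30,
  step: 1,
};

// Illustrative helper: clamp a user-entered value into the setting's range.
function clampToSetting(value: number, s: { min: number; max: number }): number {
  return Math.min(s.max, Math.max(s.min, value));
}
```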

<StatusDot isLive={isLive.value} />
- {isLive.value ? `Latency: ${formatLatency()}` : "OFFLINE"}
+ {isLive.value
+   ? `Latency: ${formatLatency()} ${playbackRate.value !== 1 ? ` (x${playbackRate.value})` : ""}`

Copilot AI Nov 19, 2025


The conditional formatting in the template literal has an extra space before the playback rate display. The expression (x${playbackRate.value}) includes a leading space that results in double spacing when concatenated. Consider:

{isLive.value
  ? `Latency: ${formatLatency()}${playbackRate.value !== 1 ? ` (x${playbackRate.value})` : ""}`
  : "OFFLINE"}
Suggested change
- ? `Latency: ${formatLatency()} ${playbackRate.value !== 1 ? ` (x${playbackRate.value})` : ""}`
+ ? `Latency: ${formatLatency()}${playbackRate.value !== 1 ? ` (x${playbackRate.value})` : ""}`

@@ -48,7 +50,9 @@ export function LatencyComponent({ click, latencyCounter, isLive }: LatencyCompo
<TooltipComponent content={"Stream delay. Click to refresh player."} position={"bottom"}>

Copilot AI Nov 19, 2025


The tooltip text "Stream delay. Click to refresh player." should be updated to reflect that the component now also displays the playback rate when latency reduction is active. Consider something like: "Stream delay and playback rate (when adjusted). Click to refresh player."

Suggested change
- <TooltipComponent content={"Stream delay. Click to refresh player."} position={"bottom"}>
+ <TooltipComponent content={"Stream delay and playback rate (when adjusted). Click to refresh player."} position={"bottom"}>

Comment on lines +95 to +97
return (
this.latencyTimings.value.reduce((accumulator, currentValue) => accumulator + currentValue, 0) /
this.latencyTimings.value.length

Copilot AI Nov 19, 2025


The averaging calculation will fail when latencyTimings.value.length is 0, resulting in a division by zero and returning NaN. Although this shouldn't happen in normal flow, it's safer to add a guard:

const length = this.latencyTimings.value.length;
if (length === 0) return -1;
return this.latencyTimings.value.reduce((accumulator, currentValue) => accumulator + currentValue, 0) / length;
Suggested change
- return (
-   this.latencyTimings.value.reduce((accumulator, currentValue) => accumulator + currentValue, 0) /
-   this.latencyTimings.value.length
+ const length = this.latencyTimings.value.length;
+ if (length === 0) return -1;
+ return (
+   this.latencyTimings.value.reduce((accumulator, currentValue) => accumulator + currentValue, 0) / length

@igorovh
Member

igorovh commented Nov 19, 2025

@Kaedriz

Hey!

It looks great! Since I see you've defined the settings both in the module and in the types, we just need to make sure we cover two things. I'm taking the examples from the chat mention sound module, so you can also look into that file if you need to.

  1. Get it from settings on initialization (After this, we have initialized values from settings):
async initialize() {
    this.defaultSound = await this.commonUtils().getAssetFile(this.workerService(), "modules/mention-sound.ogg", "");
    this.updateAudioSource(await this.settingsService().getSettingsKey("chatMentionSoundSource"));
    this.updateAudioVolume((await this.settingsService().getSettingsKey("chatMentionSoundVolume")) ?? 50);
}
  2. Listen for settings updates (Here we are handling when the user changes these values in settings):
{
    type: "event",
    key: "chat-mention-sound",
    event: "twitch:settings:chatMentionSoundSource",
    callback: this.updateAudioSource.bind(this),
},
{
    type: "event",
    key: "chat-mention-sound",
    event: "twitch:settings:chatMentionSoundVolume",
    callback: this.updateAudioVolume.bind(this),
},
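Applied to this PR, the same pattern might look like the sketch below (hypothetical: it assumes a `getSettingsKey`-style async reader as in the snippet above, plus the setting IDs and defaults mentioned in the review; not the actual module code):

```typescript
// Illustrative only: read the four latency-reducer settings once on startup,
// falling back to the defaults mentioned in the review (disabled, 1.03-1.1x, 5s).
type SettingsReader = (key: string) => Promise<number | boolean | undefined>;

async function loadLatencyReducerSettings(get: SettingsReader) {
  return {
    enabled: ((await get("streamLatencyReducerEnabled")) as boolean | undefined) ?? false,
    minRate: ((await get("streamLatencyReducerMinRate")) as number | undefined) ?? 1.03,
    maxRate: ((await get("streamLatencyReducerMaxRate")) as number | undefined) ?? 1.1,
    threshold: ((await get("streamLatencyReducerThreshold")) as number | undefined) ?? 5,
  };
}
```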

Also, please take a look at the comments from the Copilot review. But honestly, it looks very good overall!

If you have any more questions, feel free to ask!

@Kaedriz
Contributor Author

Kaedriz commented Nov 21, 2025

Is there a memory leak somewhere, though? This has been bugging me; as I leave DevTools open on Twitch while trying to integrate it there too, I noticed a gradual increase in RAM usage, to the point of system lag.
Firefox:
[screenshot]

A 2GB increase in just 1 minute!

I checked whether this happens upstream, but no, it only happens on this branch, Kick and Twitch alike.

And how do you load the dev version of the extension in Chromium browsers?

[screenshot]

I'm trying to load it unpacked, but it just deactivates itself.

I will integrate these settings later; thanks for the response. For now I'm figuring out how to disable Twitch's native playback rate changing, and dealing with these problems.

@jvxz
Contributor

jvxz commented Nov 22, 2025

@Kaedriz @igorovh if either of you could grant me permission to contribute to this PR it would be much appreciated! I got some core logic working, with some elements based on Kaedriz's work. Feel free to check it out here: https://github.com/jvxz/enhancer/tree/latency-reducer

@Kaedriz
Contributor Author

Kaedriz commented Nov 22, 2025

@Kaedriz @igorovh if either could grant me permission to contribute to this PR it would be much appreciated! I got some core logic working with some elements based on Kaedriz's work. feel free to check it out here jvxz/enhancer@latency-reducer

I did send you an invite to the fork, idk if that's it.

@igorovh
Member

igorovh commented Nov 22, 2025

@Kaedriz I didn't check the memory usage, but I think it's because of the open DevTools and the logs appearing there. I had this issue too, but after closing DevTools the memory usage dropped dramatically, so I didn't test it further.

Regarding loading the dev version, all you need to do is run bun run dev and then click the refresh button on the right side. Also, keep in mind that in the newest version of Chrome, you need to allow Local network access. This is a new requirement because we are accessing a localhost endpoint for the extension.

[screenshot]

@igorovh
Member

igorovh commented Nov 22, 2025

@jvxz I think the simplest solution will be to just fork Kaedriz's version and create a pull request with the destination set to his branch.

@jvxz
Contributor

jvxz commented Nov 22, 2025

@jvxz I think the simplest solution will be to just fork Kaedriz's version and create a pull request with the destination set to his branch.

I'll go ahead and do this

@jvxz
Contributor

jvxz commented Nov 22, 2025

Regarding loading the dev version, all you need to do is run bun run dev and then click the refresh button on the right side. Also, keep in mind that in the newest version of Chrome, you need to allow Local network access. This is a new requirement because we are accessing a localhost endpoint for the extension.

Off-topic, but I do think it is worth mentioning this in the development section of the README. I spent about an hour trying to get the dev server working and hadn't realized I needed this toggled. I can make a separate PR for this later if you are interested.

@igorovh
Member

igorovh commented Jan 7, 2026

@Kaedriz I think we can move it to the experimental tab and release it. I will also ping my guys to test it out then, and I have already requested a Copilot review.

Contributor

Copilot AI left a comment


Pull request overview

Copilot reviewed 16 out of 17 changed files in this pull request and generated 21 comments.

Comments suppressed due to low confidence (2)

src/platforms/twitch/modules/settings/settings.module.tsx:228

  • The description mentions "Disabled without Low Latency Mode" which is specific to Twitch's localStorage check. Consider clarifying that this applies only when Twitch's Low Latency Mode is enabled.
				title: "Watchtime List",
				description: "Watchtime List",
				type: "text",

src/platforms/kick/modules/settings/settings.module.tsx:177

  • The description mentions "Disabled without Low Latency Mode" but this only applies to Twitch (which checks localStorage for "lowLatencyModeEnabled"). Kick doesn't have this check, so the description is misleading for Kick users. Consider platform-specific descriptions or removing this note from Kick settings.
			{
				id: "export-import",
				title: "Export/Import Data",


@igorovh
Member

igorovh commented Jan 7, 2026

/gemini review

Contributor

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request introduces a prototype for a stream latency reducer for both Kick and Twitch platforms. The feature aims to decrease latency by dynamically adjusting the video playback speed. My review focuses on improving code quality, correctness, and performance. Key feedback includes correcting a critical class naming error, eliminating duplicated code between modules, optimizing performance by reducing redundant asynchronous calls within intervals, and improving error handling by logging exceptions instead of using empty catch blocks. There are also suggestions to improve clarity in settings descriptions and code logic.

Kaedriz and others added 21 commits January 7, 2026 21:23
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
@Kaedriz
Contributor Author

Kaedriz commented Jan 7, 2026

Okay, I think I got most of them.
The latency calculation could maybe be modularized further, but I will leave it for now.

Moved settings to experimental.

[screenshot]

@igorovh
Member

igorovh commented Jan 7, 2026

Thanks guys! I will take one more look at this later this week. If everything is fine I will release it, thanks again!

Contributor

Copilot AI left a comment


Copilot encountered an error and was unable to review this pull request. You can try again by re-requesting a review.
