
Timestamp correction algorithm: default to 10ms instead of automatically calculating the timestamp. #425

Closed

Description

@winlinvip

The key part of the current timestamp correction algorithm is:

        // calc the right diff by audio sample rate
        if (msg->is_audio() && sample_rate > 0) {
            // audio: convert the delta to milliseconds using the sample rate
            delta = (int64_t)(delta * 1000.0 / sample_rate);
        } else if (msg->is_video() && frame_rate > 0) {
            // video: derive the delta from the frame rate
            delta = (int64_t)(delta * 1.0 / frame_rate);
        } else {
            // neither is known: fall back to the fixed default frame time
            delta = DEFAULT_FRAME_TIME_MS;
        }

That is, the delta is calculated from the stream's fps and sample rate. In practice this is not accurate: if a stream is already problematic, the server cannot assume that its reported fps and sample rate yield a correct timestamp. On the contrary, when a stream's timestamps are wrong, an obvious correction should be applied so that the anomaly is visible. Therefore the default 10ms is sufficient: whenever the timestamp jumps, correct the delta to 10 milliseconds.
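A minimal sketch of the proposed behavior (the function name correct_delta, the last_pkt_time bookkeeping, and the CONST_MAX_JITTER_MS jump threshold are illustrative assumptions, not the exact SRS code):

    #include <cstdint>

    // Hypothetical constants for illustration, mirroring the snippet above;
    // CONST_MAX_JITTER_MS is an assumed jump-detection threshold.
    const int64_t DEFAULT_FRAME_TIME_MS = 10;
    const int64_t CONST_MAX_JITTER_MS = 250;

    // Sketch of the proposed rule: compute the delta between consecutive
    // packet timestamps, and whenever it jumps (negative or beyond the
    // threshold), force it to the fixed 10ms instead of deriving a value
    // from the stream's fps or sample rate.
    int64_t correct_delta(int64_t time, int64_t last_pkt_time)
    {
        int64_t delta = time - last_pkt_time;
        if (delta < 0 || delta > CONST_MAX_JITTER_MS) {
            delta = DEFAULT_FRAME_TIME_MS;
        }
        return delta;
    }

The fixed 10ms makes the correction deterministic, so an anomalous stream remains playable while the constant delta makes the anomaly easy to spot.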

TRANS_BY_GPT3

Labels

Bug (It might be a bug.) · Enhancement (Improvement or enhancement.) · TransByAI (Translated by AI/GPT.)
