
[pkg/stanza] Reconsider aggressive sampling policy for logger #23801

Closed
BinaryFissionGames opened this issue Jun 27, 2023 · 6 comments · Fixed by #29886
Labels: enhancement (New feature or request), pkg/stanza

@BinaryFissionGames (Contributor):

Component(s)

pkg/stanza

Describe the issue you're reporting

Stanza pipelines are built with a very aggressive sampling policy.

https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/2d23e4d9e0eb313da5d192e1066f444d19b8601f/pkg/stanza/pipeline/config.go#L38C2-L38C2

If I'm reading this correctly, it means that every second, one initial log with a particular message is allowed through, and after that only one in every 10,000 logs with that same message is allowed.
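For illustration, here is a minimal sketch, assuming zap (the logging library the collector uses), of how such a policy is constructed; this is not the literal config.go code, just the shape of a 1-initial / 1-in-10000 sampler:

```go
package sample

import (
	"time"

	"go.uber.org/zap"
	"go.uber.org/zap/zapcore"
)

// newSampledLogger wraps a logger with the policy described above:
// within each one-second tick, the first record carrying a given
// message passes, and afterwards only every 10000th record with that
// same message passes.
func newSampledLogger(base *zap.Logger) *zap.Logger {
	return base.WithOptions(zap.WrapCore(func(core zapcore.Core) zapcore.Core {
		return zapcore.NewSamplerWithOptions(core, time.Second, 1, 10000)
	}))
}
```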

This seems overly aggressive. In fact, it causes the file consumer to log only the first file it finds on startup:
https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/2d23e4d9e0eb313da5d192e1066f444d19b8601f/pkg/stanza/fileconsumer/file.go#L188C35-L188C35

That is, if 10 files are picked up on startup, only one of them is logged, which makes it difficult to tell whether a given file was picked up at all.
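To make the effect concrete, here is a hypothetical standalone program (the message text and file paths are invented for illustration) showing that ten identical startup messages collapse to one under this sampler, since zap's sampler keys on the message string:

```go
package main

import (
	"fmt"
	"time"

	"go.uber.org/zap"
	"go.uber.org/zap/zapcore"
)

func main() {
	base, err := zap.NewDevelopment()
	if err != nil {
		panic(err)
	}
	// Apply the same 1-initial / 1-in-10000 per-second policy.
	sampled := base.WithOptions(zap.WrapCore(func(core zapcore.Core) zapcore.Core {
		return zapcore.NewSamplerWithOptions(core, time.Second, 1, 10000)
	}))

	// Ten files "found" on startup emit ten records with the same message;
	// only the first is printed, the other nine are dropped by the sampler.
	for i := 0; i < 10; i++ {
		sampled.Info("Started watching file",
			zap.String("path", fmt.Sprintf("/var/log/app-%d.log", i)))
	}
}
```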

In my opinion, we should rely on the logger already being sampled. The collector supports log sampling and enables it by default with initial = 100 and thereafter = 100.
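For comparison, a sketch of zap's production defaults, which already encode that 100/100 policy (the collector actually wires its logger through service telemetry settings, so this is illustrative only):

```go
package main

import "go.uber.org/zap"

func main() {
	// NewProductionConfig sets Sampling to
	// &zap.SamplingConfig{Initial: 100, Thereafter: 100}: per second,
	// the first 100 records per message pass, then every 100th after.
	cfg := zap.NewProductionConfig()
	logger, err := cfg.Build()
	if err != nil {
		panic(err)
	}
	defer func() { _ = logger.Sync() }()
	logger.Info("already sampled by the default policy; no extra wrapping needed")
}
```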

@BinaryFissionGames added the needs triage label Jun 27, 2023
@github-actions (bot):

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@djaglowski added the enhancement label and removed the needs triage label Jun 28, 2023
@djaglowski (Member):

@BinaryFissionGames I think this is a good idea. The behavior is a holdover from when stanza was an independent agent. I see no reason we should be overriding global behaviors at this point.

@github-actions (bot):

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@github-actions (bot):

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

github-actions bot added the Stale label Oct 30, 2023
bogdandrutu pushed a commit that referenced this issue Dec 14, 2023
**Description:**
Remove sampling policy from logger

**Link to tracking Issue:**
Fixes #23801
@codeboten (Contributor):

After #29886 was merged, I noted a change in the benchmarks. It looks like a 5-10% increase in CPU usage for at least some of the benchmark tests:

[Screenshot: benchmark comparison showing a 5-10% CPU increase on some tests]

@atoulme reopened this Dec 19, 2023
github-actions bot removed the Stale label Dec 20, 2023
cparkins pushed a commit to AmadeusITGroup/opentelemetry-collector-contrib that referenced this issue Jan 10, 2024
**Description:**
Remove sampling policy from logger

**Link to tracking Issue:**
Fixes open-telemetry#23801
@djaglowski self-assigned this Jan 18, 2024
@djaglowski (Member):

I spent quite some time looking into this today and found nothing conclusive. However, there was a preexisting problem, fixed by #30301, which may have contributed to excess logging. I can't see the full history, but it appears these two tests specifically show better performance than before this change. I'm closing this issue, but feel free to reopen it if there is more to discuss.

[Screenshot: benchmark results for the two tests, showing improved performance]
