
Users should be able to configure max HTTP concurrent connections #27066

Closed
splunkericl opened this issue Sep 21, 2023 · 4 comments
Labels
enhancement New feature or request needs triage New item requiring triage receiver/splunkhec

Comments

@splunkericl
Contributor

Component(s)

receiver/splunkhec

Is your feature request related to a problem? Please describe.

As a user of the splunk hec receiver, I want to be able to configure the maximum number of concurrent connections my HTTP server can serve. This protects the server from being bombarded by requests and running out of memory (OOM).

Since splunkhecreceiver uses HTTPServerSettings, which is shared across the collector repo, this change will apply to any receiver that spawns an HTTP server.

Describe the solution you'd like

A new config option should be added in HTTPServerSettings:

type HTTPServerSettings struct {
    ...
    MaxConcurrentConnections int `mapstructure:"max_concurrent_connections"`
}

The value defaults to 0, which means no limit. If a positive number is set for this field, a rate-limiting listener would be added:

func (hss *HTTPServerSettings) ToListener() (net.Listener, error) {
    ...
    if hss.MaxConcurrentConnections > 0 {
        listener = netutil.LimitListener(listener, hss.MaxConcurrentConnections)
    }
    ...
}

New instrumentation should also be added for connections: a gauge that tracks the number of open connections.

Describe alternatives you've considered

No response

Additional context

Limit Listener: https://pkg.go.dev/golang.org/x/net/netutil#LimitListener

@splunkericl splunkericl added enhancement New feature or request needs triage New item requiring triage labels Sep 21, 2023
@github-actions
Contributor

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@dmitryax
Member

@splunkericl PTAL at open-telemetry/opentelemetry-collector#8632. We are working towards moving memory_limiter to the receiver side. The goal is pretty similar. If a certain memory threshold is reached, the collector won't be accepting any connections. I think that solution should also work for your use case, so we don't need to introduce any new configuration to the http server.

@splunkericl
Contributor Author

open-telemetry/opentelemetry-collector#8632 is definitely better than what is proposed here. In the end we want to protect the otel collector host from running out of memory.

@crobert-1
Member

I'm going to close this issue in favor of open-telemetry/opentelemetry-collector#8632, as it looks like that would accomplish the goal here. Let me know if I misunderstood or if there's anything else to discuss though!

@crobert-1 crobert-1 closed this as not planned Nov 1, 2023