
[exporter/googlecloud] OpAMP extension panics if googlecloud exporter is configured #34628

Closed
BinaryFissionGames opened this issue Aug 13, 2024 · 9 comments

@BinaryFissionGames
Contributor

Component(s)

exporter/googlecloud

What happened?

Description

If the google cloud exporter is configured, the OpAMP extension may panic when it tries to compose the effective config.

panic: cannot marshal type: func() []option.ClientOption [recovered]
	panic: cannot marshal type: func() []option.ClientOption

The issue is that the functional options in the config struct cannot be marshalled, so the YAML marshal panics. To fix this, we'll need to put mapstructure:"-" tags on the functional option fields in the googlecloud exporter config structs.
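
For illustration, the shape of the fix would be something like the sketch below. The struct and field names are made up for this example; the mapstructure:"-" tag is the part that matters.

package example

import "google.golang.org/api/option"

// Illustrative sketch only, not the exporter's actual config struct.
// The mapstructure:"-" tag tells the confmap/mapstructure encoder to skip
// the field entirely, so the function value never reaches the YAML marshal.
type ClientConfig struct {
	Project string `mapstructure:"project"`

	// Functional options have no YAML representation and should be
	// excluded from the marshalled effective config.
	GetClientOptions func() []option.ClientOption `mapstructure:"-"`
}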

I also think the opamp extension could probably handle this situation more gracefully, by at the very least catching the panic.
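
For example, the extension could wrap the marshalling step in a recover. A rough sketch (the function name and placement are mine, not the extension's actual code):

package opampextension

import (
	"fmt"

	"go.opentelemetry.io/collector/confmap"
	"gopkg.in/yaml.v3"
)

// Sketch of a defensive wrapper: if the YAML encoder panics on a value it
// cannot represent, recover turns the panic into an error the extension can
// report instead of taking down the whole collector.
func marshalEffectiveConfig(conf *confmap.Conf) (out []byte, err error) {
	defer func() {
		if r := recover(); r != nil {
			err = fmt.Errorf("cannot marshal effective config: %v", r)
		}
	}()
	return yaml.Marshal(conf.ToStringMap())
}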

Collector version

v0.106.1

Environment information

No response

OpenTelemetry Collector configuration

No response

Log output

No response

Additional context

No response

@BinaryFissionGames added the "bug" and "needs triage" labels Aug 13, 2024

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@JaredTan95
Member

Can you post a minimal YAML configuration that reproduces this issue?

@dashpole
Contributor

I'll make sure the tags are added in the next release! I opened GoogleCloudPlatform/opentelemetry-operations-go#879

@BinaryFissionGames
Contributor Author

Thanks @dashpole! I think it's not just this option; basically anything in the config that can't be marshalled by the yaml package will end up causing problems.

For reference, here's a small config that exhibits the issue (you do not need an OpAMP server running to reproduce):

receivers:
  hostmetrics:
    collection_interval: 15s
    scrapers:
      load:

exporters:
  googlecloud:
    project: "${GCP_PROJECT}"

extensions:
  opamp:
    capabilities:
      reports_effective_config: true
    server:
      ws:
        endpoint: "ws://localhost:1111/v1/opamp"
        tls:
          insecure: true

service:
  extensions: [opamp]
  pipelines:
    metrics:
      receivers: [hostmetrics]
      exporters: [googlecloud]

@dashpole
Contributor

@BinaryFissionGames adding the mapstructure tag didn't seem to fix it: GoogleCloudPlatform/opentelemetry-operations-go#880. Is the test case I added correct?

@BinaryFissionGames
Contributor Author

BinaryFissionGames commented Aug 13, 2024

> @BinaryFissionGames adding the mapstructure tag didn't seem to fix it: GoogleCloudPlatform/opentelemetry-operations-go#880. Is the test case I added correct?

There's a step before the marshal that adds it to a confmap. I think this is what the test should be:

// Imports added for completeness. The yaml import is assumed to be
// gopkg.in/yaml.v3 here; adjust to whichever yaml package the repo uses.
import (
	"testing"

	"go.opentelemetry.io/collector/confmap"
	"gopkg.in/yaml.v3"
)

func TestMarshal(t *testing.T) {
	config := DefaultConfig()

	cm := confmap.New()
	err := cm.Marshal(config)
	if err != nil {
		t.Fatal(err)
	}

	_, err = yaml.Marshal(cm.ToStringMap())
	if err != nil {
		t.Fatal(err)
	}
}

@dashpole
Contributor

Thanks. That seemed to work.


This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

github-actions bot added the Stale label Oct 14, 2024
@dashpole
Contributor

Fixed by #35366
