Closed
Description
Dear support,
I recently tried the latest release in a Docker container to evaluate the new PBM metrics (--collector.pbm), and it crashes because it can't create a PBM client. MongoDB is 7.0.14 in a 3-node replica set.
Container log and stack trace:
level=info ts=2024-09-26T09:25:44.558Z caller=tls_config.go:313 msg="Listening on" address=[::]:9216
level=info ts=2024-09-26T09:25:44.558Z caller=tls_config.go:316 msg="TLS is disabled." http2=false address=[::]:9216
time="2024-09-26T09:26:04Z" level=error msg="failed to create PBM client from uri mongodb://192.1.1.181:27017: create mongo connection: ping: server selection error: context deadline exceeded, current topology: { Type: ReplicaSetNoPrimary, Servers: [{ Addr: mongo7-1:27017, Type: Unknown, Last error: dial tcp: lookup mongo7-1 on 192.1.0.2:53: no such host }, { Addr: mongo7-2:27017, Type: Unknown, Last error: dial tcp: lookup mongo7-2 on 192.1.0.2:53: no such host }, { Addr: mongo7-3:27017, Type: Unknown, Last error: dial tcp: lookup mongo7-3 on 192.1.0.2:53: no such host }, ] }" collector=pbm
2024/09/26 09:26:10 http: panic serving 172.17.0.1:50608: descriptor Desc{fqName: "collector_scrape_time_ms", help: "Time taken for scrape by collector", constLabels: {collector="profile"}, variableLabels: {exporter}} already exists with the same fully-qualified name and const label values
goroutine 51 [running]:
net/http.(*conn).serve.func1()
/opt/hostedtoolcache/go/1.22.7/x64/src/net/http/server.go:1903 +0xbe
panic({0xd1c980?, 0xc000e6cdb0?})
/opt/hostedtoolcache/go/1.22.7/x64/src/runtime/panic.go:770 +0x132
github.com/prometheus/client_golang/prometheus.(*Registry).MustRegister(...)
/home/runner/go/pkg/mod/github.com/prometheus/client_golang@v1.19.1/prometheus/registry.go:405
github.com/percona/mongodb_exporter/exporter.(*Exporter).makeRegistry(0xc000447260, {0x13934f8, 0xc00043ad20}, 0xc00012c1c0, {0x13901b8, 0xc0004465d0}, {{0xc000447230, 0x3, 0x3}, 0x0, ...})
/home/runner/work/mongodb_exporter/mongodb_exporter/exporter/exporter.go:247 +0xc19
github.com/percona/mongodb_exporter/exporter.RunWebServer.(*Exporter).Handler.func2({0x1392720, 0xc00012c0e0}, 0xc000700240)
/home/runner/work/mongodb_exporter/mongodb_exporter/exporter/exporter.go:376 +0x6cf
net/http.HandlerFunc.ServeHTTP(0x19c6f40?, {0x1392720?, 0xc00012c0e0?}, 0x7fa63a?)
/opt/hostedtoolcache/go/1.22.7/x64/src/net/http/server.go:2171 +0x29
net/http.(*ServeMux).ServeHTTP(0x471859?, {0x1392720, 0xc00012c0e0}, 0xc000700240)
/opt/hostedtoolcache/go/1.22.7/x64/src/net/http/server.go:2688 +0x1ad
net/http.serverHandler.ServeHTTP({0xc000447ec0?}, {0x1392720?, 0xc00012c0e0?}, 0x6?)
/opt/hostedtoolcache/go/1.22.7/x64/src/net/http/server.go:3142 +0x8e
net/http.(*conn).serve(0xc0003c9d40, {0x1393450, 0xc0004475c0})
/opt/hostedtoolcache/go/1.22.7/x64/src/net/http/server.go:2044 +0x5e8
created by net/http.(*Server).Serve in goroutine 26
/opt/hostedtoolcache/go/1.22.7/x64/src/net/http/server.go:3290 +0x4b4
2024/09/26 09:26:10 http: panic serving 172.17.0.1:59336: descriptor Desc{fqName: "collector_scrape_time_ms", help: "Time taken for scrape by collector", constLabels: {collector="profile"}, variableLabels: {exporter}} already exists with the same fully-qualified name and const label values
goroutine 559 [running]:
net/http.(*conn).serve.func1()
/opt/hostedtoolcache/go/1.22.7/x64/src/net/http/server.go:1903 +0xbe
panic({0xd1c980?, 0xc0006fc4e0?})
/opt/hostedtoolcache/go/1.22.7/x64/src/runtime/panic.go:770 +0x132
github.com/prometheus/client_golang/prometheus.(*Registry).MustRegister(...)
/home/runner/go/pkg/mod/github.com/prometheus/client_golang@v1.19.1/prometheus/registry.go:405
github.com/percona/mongodb_exporter/exporter.(*Exporter).makeRegistry(0xc000447260, {0x13934f8, 0xc000e5eee0}, 0xc00012c380, {0x13901b8, 0xc000bb24b0}, {{0xc000447230, 0x3, 0x3}, 0x0, ...})
/home/runner/work/mongodb_exporter/mongodb_exporter/exporter/exporter.go:247 +0xc19
github.com/percona/mongodb_exporter/exporter.RunWebServer.(*Exporter).Handler.func2({0x1392720, 0xc00012c2a0}, 0xc000e885a0)
/home/runner/work/mongodb_exporter/mongodb_exporter/exporter/exporter.go:376 +0x6cf
net/http.HandlerFunc.ServeHTTP(0x19c6f40?, {0x1392720?, 0xc00012c2a0?}, 0x7fa63a?)
/opt/hostedtoolcache/go/1.22.7/x64/src/net/http/server.go:2171 +0x29
net/http.(*ServeMux).ServeHTTP(0x471859?, {0x1392720, 0xc00012c2a0}, 0xc000e885a0)
/opt/hostedtoolcache/go/1.22.7/x64/src/net/http/server.go:2688 +0x1ad
net/http.serverHandler.ServeHTTP({0xc000e67cb0?}, {0x1392720?, 0xc00012c2a0?}, 0x6?)
/opt/hostedtoolcache/go/1.22.7/x64/src/net/http/server.go:3142 +0x8e
net/http.(*conn).serve(0xc000656090, {0x1393450, 0xc0004475c0})
/opt/hostedtoolcache/go/1.22.7/x64/src/net/http/server.go:2044 +0x5e8
created by net/http.(*Server).Serve in goroutine 26
/opt/hostedtoolcache/go/1.22.7/x64/src/net/http/server.go:3290 +0x4b4
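If I read the trace correctly, the handler panics because a later scrape re-registers a collector whose descriptor (collector_scrape_time_ms with collector="profile") already exists in the registry. Below is a minimal standalone sketch of that failure mode, not the exporter's actual code; the metric name and labels are copied from the panic message:

```go
// Sketch of the duplicate-descriptor panic, assuming the same metric gets
// registered twice into one registry. Not the exporter's code.
package main

import (
	"errors"
	"fmt"

	"github.com/prometheus/client_golang/prometheus"
)

// newScrapeTime mirrors the descriptor from the panic message:
// fqName "collector_scrape_time_ms", const label collector="profile",
// variable label "exporter".
func newScrapeTime() *prometheus.GaugeVec {
	return prometheus.NewGaugeVec(prometheus.GaugeOpts{
		Name:        "collector_scrape_time_ms",
		Help:        "Time taken for scrape by collector",
		ConstLabels: prometheus.Labels{"collector": "profile"},
	}, []string{"exporter"})
}

func main() {
	reg := prometheus.NewRegistry()
	reg.MustRegister(newScrapeTime()) // first registration succeeds

	// Registering an identical descriptor again via MustRegister panics,
	// which is what the stack trace above shows. Register returns the
	// error instead, so it could be handled without killing the handler.
	if err := reg.Register(newScrapeTime()); err != nil {
		var are prometheus.AlreadyRegisteredError
		if errors.As(err, &are) {
			fmt.Println("already registered:", are.ExistingCollector)
		}
	}
}
```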
Repro
docker run -d --name "percona-mg-exporter" -p 9216:9216 --rm --expose 9216 percona/mongodb_exporter:0.41 --mongodb.uri=mongodb://192.1.1.181:27017 --collect-all --mongodb.collstats-colls=airbnb.calendar,airbnb.listings,airbnb.reviews --collector.pbm
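The PBM error itself looks like a server-selection failure: the replica set advertises the hostnames mongo7-1/2/3, which don't resolve inside the exporter container, so topology discovery fails even though the seed IP 192.1.1.181 is reachable. A quick way to confirm that from inside the container (a sketch with the official Go driver, not part of the exporter; directConnection=true skips replica-set discovery and should succeed while the plain URI fails):

```go
// Sketch to check the DNS hypothesis, assuming the same URI as in the report.
package main

import (
	"context"
	"log"
	"time"

	"go.mongodb.org/mongo-driver/mongo"
	"go.mongodb.org/mongo-driver/mongo/options"
)

func ping(uri string) error {
	ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
	defer cancel()

	client, err := mongo.Connect(ctx, options.Client().ApplyURI(uri))
	if err != nil {
		return err
	}
	defer client.Disconnect(context.Background())

	return client.Ping(ctx, nil)
}

func main() {
	// With the plain URI the driver discovers mongo7-1/2/3 (unresolvable
	// from the container); with directConnection it only talks to the seed.
	log.Println("replica-set discovery:", ping("mongodb://192.1.1.181:27017"))
	log.Println("direct connection:", ping("mongodb://192.1.1.181:27017/?directConnection=true"))
}
```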
Environment
- Ubuntu 22.04
- Docker 27.3.1
- MongoDB 7.0.14
Thanks,
~David