Important Details
How are you running Sentry?
- On-Premise docker [Version 10]
- Saas (sentry.io)
- Other [briefly describe your environment]
Description
Ran into this bug: getsentry/sentry#17422
Now that it has been fixed, this happens during the migration:
Applying sentry.0024_auto_20191230_2052...Events to process: 22961
06:54:05 [WARNING] sentry.eventstream.kafka.backend: Could not publish message (error: KafkaError{code=MSG_SIZE_TOO_LARGE,val=10,str="Broker: Message size too large"}): <cimpl.Message object at 0x7fd6a4ab35d0>
06:54:05 [WARNING] sentry.eventstream.kafka.backend: Could not publish message (error: KafkaError{code=MSG_SIZE_TOO_LARGE,val=10,str="Broker: Message size too large"}): <cimpl.Message object at 0x7fd6a4ab35d0>
The same Kafka error is logged many times, and after quite a while the migration continues:
Event migration done. Migrated 22961 of 22961 events.
- and then eventually finish migrations.
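One way to check whether the broker/topic limit is the culprit is to inspect the topic's `max.message.bytes` override directly. This is a hedged sketch, not part of the original report: the container name `sentry_onpremise_kafka_1`, the ZooKeeper address, and the topic name `events` are assumptions about a default Sentry 10 on-premise install and may differ in your setup.

```shell
# Assumed container/host names for a default docker-compose install.
# Show any per-topic size override on the "events" topic:
docker exec sentry_onpremise_kafka_1 kafka-configs \
  --zookeeper zookeeper:2181 \
  --entity-type topics --entity-name events --describe

# Raise the per-topic limit to ~50 MB (value is a guess, not a recommendation):
docker exec sentry_onpremise_kafka_1 kafka-configs \
  --zookeeper zookeeper:2181 \
  --entity-type topics --entity-name events --alter \
  --add-config max.message.bytes=50000000
```

If no override is set, the broker-wide default `message.max.bytes` (roughly 1 MB out of the box) applies, which would explain the `MSG_SIZE_TOO_LARGE` warnings for oversized migrated events.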
Steps to Reproduce
- git checkout master --force
- git pull
- mv sentry/config.example.yml sentry/config.yml (insert security key)
- ./install.sh
- Wait
There are no special configurations. Dataset is in an existing postgres 9.6 container.
What you expected to happen
No errors
Possible Solution
Perhaps the message size limit should be raised somewhere, e.g. on the Kafka broker or the relevant topic?
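If the fix is to raise the broker-wide limit, one possible place is the Kafka service in the on-premise `docker-compose.yml`. This is a hedged sketch, assuming the Confluent `cp-kafka` image (which maps `KAFKA_*` environment variables onto broker settings); the value and service name are illustrative, not a verified fix.

```yaml
# Illustrative docker-compose fragment (assumed cp-kafka image conventions).
# KAFKA_MESSAGE_MAX_BYTES maps to the broker's message.max.bytes setting.
kafka:
  environment:
    KAFKA_MESSAGE_MAX_BYTES: 50000000  # ~50 MB; default is ~1 MB
```

Note that producer-side (`message.max.bytes` in the client config) and consumer-side (`fetch.message.max.bytes`) limits may also need to agree with the broker setting.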
MRigal