Background
The changefeed sends messages to the Kafka server. If a message is larger than Kafka's `max.message.bytes` configuration, an error is reported to the changefeed and it gets stuck.

It is unexpected that an occasional large message prevents the whole changefeed from making progress. We propose that when a message exceeds the size limit, the changefeed encodes only the primary key (or a NOT NULL unique key) of the event and sets a new flag to indicate this. When a Kafka consumer receives such a message, it should use the PK / UK to query the upstream TiDB for the corresponding row data.
Development
Docs