Description
Background
Our usage scenario is this: our database contains about a dozen tables, but we only care about changes to five of them. The volume of changes in the other tables far exceeds that of the five we care about. As a result, our service's memory consumption is very high while its CPU utilization stays low, and the replication lag for these five tables has also grown noticeably.
Thoughts
I suspect this is caused by continuously allocating buffers and decoding events for the tables we don't care about. Is there a mechanism to determine that a received event belongs to an irrelevant table, so that we can skip decoding it and avoid passing it on to the user? If not, do you think it would make sense to implement this as a feature? I would be happy to contribute it. Thanks!
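To illustrate the idea, here is a minimal sketch of the kind of filter I have in mind. All names here (`watchedTables`, `rawEvent`, `tableOf`, `decode`, the wire layout with a NUL-terminated table name) are hypothetical placeholders, not the library's real API: the point is only that a cheap peek at the table name lets us drop an event before paying the buffer-allocation and decoding cost.

```go
package main

import (
	"bytes"
	"fmt"
)

// watchedTables is a hypothetical allow-list of the five tables we care about.
var watchedTables = map[string]bool{
	"orders": true,
	"users":  true,
}

// rawEvent stands in for an undecoded change event. The real wire format
// depends on the library; this layout is an assumption: the table name
// comes first, terminated by a NUL byte, followed by the row payload.
type rawEvent []byte

// tableOf peeks at the table name without decoding the payload.
func tableOf(e rawEvent) string {
	if i := bytes.IndexByte(e, 0); i >= 0 {
		return string(e[:i])
	}
	return ""
}

// decode represents the expensive step we want to skip for irrelevant tables.
func decode(e rawEvent) string {
	i := bytes.IndexByte(e, 0)
	return string(e[i+1:])
}

// filterAndDecode decodes only the events that belong to watched tables;
// everything else is dropped after the cheap table-name check.
func filterAndDecode(events []rawEvent) []string {
	var out []string
	for _, e := range events {
		if !watchedTables[tableOf(e)] {
			continue // no buffer creation, no decode for this event
		}
		out = append(out, decode(e))
	}
	return out
}

func main() {
	events := []rawEvent{
		rawEvent("orders\x00row1"),
		rawEvent("audit_log\x00row2"), // filtered out before decoding
		rawEvent("users\x00row3"),
	}
	fmt.Println(filterAndDecode(events))
}
```

In practice the filter would probably be a user-supplied predicate or an include-list option on the connection, applied as early as possible in the receive path.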