
Conversation

Gsantomaggio (Member) commented Jun 27, 2022

Closes #136

Using this gist to test performance: https://gist.github.com/Gsantomaggio/cd5b0734225d3fab9ed013d634eaf005

cc @ricsiLT

Test results for Producer and Reliable Producer, with and without Batch Send:

*****Standard Producer Batch Send***** send time: 00:00:24.3330340, messages sent: 20000000

*****Standard Producer Send***** send time: 00:00:24.6326600, messages sent: 20000000

*****Reliable Producer Batch Send***** send time: 00:00:27.2689740, messages sent: 20000000

*****Reliable Producer Send No Batch***** send time: 00:00:31.4620940, messages sent: 20000000

These values are as expected. The good news is that the Reliable Producer Batch Send time is not far from the standard producer with Batch Send.
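For context, a minimal sketch of how the two send paths above might be driven, assuming the API shapes referenced in this PR (`Producer.Send` taking a publishing id and a `Message`, and the new `Producer.BatchSend` taking a list). This is a hypothetical illustration, not the exact gist code; the setup and variable names are assumptions:

```csharp
// Hypothetical sketch; assumes Producer.Send(publishingId, message)
// and the Producer.BatchSend(messages) method added in this PR.
var messages = new List<Message>();
for (var i = 0; i < 100; i++)
{
    messages.Add(new Message(Encoding.UTF8.GetBytes($"hello{i}")));
}

// "Send": one message per call; the client aggregates frames internally.
ulong publishingId = 0;
foreach (var message in messages)
{
    await producer.Send(++publishingId, message);
}

// "Batch Send": the caller hands over the whole list in one call,
// so the batch boundary is controlled explicitly by the caller.
var sw = Stopwatch.StartNew();
await producer.BatchSend(messages);
sw.Stop();
Console.WriteLine($"send time: {sw.Elapsed}, messages sent: {messages.Count}");
```

The benchmark numbers above suggest the main win of `BatchSend` is skipping the internal aggregation step, which matters most for the Reliable Producer path.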

Signed-off-by: Gabriele Santomaggio <G.santomaggio@gmail.com>
@Gsantomaggio Gsantomaggio marked this pull request as draft June 27, 2022 07:25
codecov bot commented Jun 27, 2022

Codecov Report

Merging #142 (38af58c) into main (86f924d) will decrease coverage by 0.01%.
The diff coverage is 92.70%.

❗ Current head 38af58c differs from pull request most recent head df59028. Consider uploading reports for the commit df59028 to get more accurate results

@@            Coverage Diff             @@
##             main     #142      +/-   ##
==========================================
- Coverage   92.09%   92.07%   -0.02%     
==========================================
  Files          77       77              
  Lines        5855     5968     +113     
  Branches      360      375      +15     
==========================================
+ Hits         5392     5495     +103     
- Misses        374      385      +11     
+ Partials       89       88       -1     
Impacted Files Coverage Δ
...abbitMQ.Stream.Client/Reliable/ReliableProducer.cs 86.82% <83.87%> (-0.86%) ⬇️
RabbitMQ.Stream.Client/Producer.cs 78.43% <86.11%> (+2.86%) ⬆️
RabbitMQ.Stream.Client/Client.cs 91.51% <100.00%> (-1.31%) ⬇️
...abbitMQ.Stream.Client/Reliable/ConfirmationPipe.cs 100.00% <100.00%> (ø)
Tests/ProducerSystemTests.cs 100.00% <100.00%> (ø)
Tests/ReliableTests.cs 100.00% <100.00%> (ø)
RabbitMQ.Stream.Client/WireFormatting.cs 70.50% <0.00%> (-4.32%) ⬇️
RabbitMQ.Stream.Client/MetaData.cs 91.58% <0.00%> (+0.93%) ⬆️
RabbitMQ.Stream.Client/Subscribe.cs 65.21% <0.00%> (+1.08%) ⬆️
... and 2 more

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 86f924d...df59028. Read the comment docs.

Signed-off-by: Gabriele Santomaggio <G.santomaggio@gmail.com>
@Gsantomaggio Gsantomaggio marked this pull request as ready for review June 30, 2022 14:51
Gsantomaggio (Member, Author) commented

@ricsiLT when you have time, can you please test it?

ricsiLT (Contributor) commented Jul 1, 2022

Just built it, will deploy to syst shortly ^^

Signed-off-by: Gabriele Santomaggio <G.santomaggio@gmail.com>
Gsantomaggio (Member, Author) commented Jul 1, 2022

ricsiLT (Contributor) commented Jul 1, 2022

Seems to be working fine. Would only be able to test with huge loads next week, but small publishes look ok.

Gsantomaggio (Member, Author) commented

Thanks @ricsiLT I will leave the PR open and wait for feedback. No rush.

ricsiLT (Contributor) commented Jul 1, 2022

Hmmm, in the end I did get some errors:

System.InvalidOperationException: Total size of messages in batch is too big. Max allowed is 1048576
   at RabbitMQ.Stream.Client.Producer.PreValidateBatch(List`1 messages)
   at RabbitMQ.Stream.Client.Reliable.ReliableProducer.BatchSend(List`1 messages)

Should this be handled on my end? Or should the function split it into multiple batches? How can I change that limit?

Also, good time to look at the messages I send - they shouldn't be so heavy...

Gsantomaggio (Member, Author) commented

> Should this be handled on my end?

Yes.

> Or should the function split it into multiple batches?

This is what the standard Send already does internally, so it doesn't make sense to add a split here; otherwise we would just have another Send.

> How can I change that limit?

It is possible to request a higher value, but at the moment the API is missing. I will add it.
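Since splitting is left to the caller, here is a possible caller-side sketch. This is hypothetical: the 1,048,576-byte cap is taken from the exception above, `BatchSend` is the method added in this PR, and the per-message size estimate (`message.Data.Size`) is an assumption about the `Message` type:

```csharp
// Hypothetical caller-side chunking before BatchSend.
// MaxBatchBytes mirrors the limit reported by PreValidateBatch.
const int MaxBatchBytes = 1_048_576;

var batch = new List<Message>();
var batchBytes = 0;
foreach (var message in messages)
{
    // Assumed way to estimate the on-wire size of one message.
    var size = message.Data.Size;
    if (batchBytes + size > MaxBatchBytes && batch.Count > 0)
    {
        // Flush the current batch before it exceeds the limit.
        await reliableProducer.BatchSend(batch);
        batch = new List<Message>();
        batchBytes = 0;
    }
    batch.Add(message);
    batchBytes += size;
}
if (batch.Count > 0)
{
    await reliableProducer.BatchSend(batch);
}
```

Note that the size estimate should account for per-message framing overhead, so in practice a margin below the hard limit is safer than filling batches to exactly 1 MiB.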

Gsantomaggio (Member, Author) commented Jul 1, 2022

> How can I change that limit?

Well, at the moment it is not possible. I also just found a small bug, fixed here: rabbitmq/rabbitmq-server#5131 :)

ricsiLT (Contributor) commented Jul 15, 2022

Can confirm that sending small batches works, and validation for batches that are too big seems to work as well :D Merge it in!

@Gsantomaggio Gsantomaggio merged commit da980eb into main Jul 15, 2022
@Gsantomaggio Gsantomaggio deleted the batch_send branch July 15, 2022 14:17


Development

Successfully merging this pull request may close these issues.

Request for awaitable batch publish

3 participants