
caching msg.data value #798

Merged
merged 1 commit into develop from neovi-msg.data-cache on Mar 23, 2020

Conversation

pierreluctg (Collaborator)

Avoid calling msg.data multiple times by caching the msg.data value.
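The change amounts to hoisting the repeated msg.data property access into a local variable and reusing that reference. A minimal sketch of the pattern, assuming a python-can Message-like object (build_frame and its body are illustrative placeholders, not the actual neovi interface code touched by this PR):

```python
def build_frame(msg):
    # Before the change, msg.data was read on every use, paying the
    # attribute/property lookup cost each time:
    #   length = len(msg.data)
    #   payload = bytes(msg.data)
    #
    # After the change, msg.data is read once and the local reference
    # is reused everywhere else.
    data = msg.data
    length = len(data)
    payload = bytes(data)
    return length, payload


if __name__ == "__main__":
    import can  # python-can, the project this PR belongs to

    msg = can.Message(arbitration_id=0x123, data=[0x01, 0x02, 0x03])
    print(build_frame(msg))  # (3, b'\x01\x02\x03')
```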
@pierreluctg pierreluctg marked this pull request as ready for review March 20, 2020 19:58
felixdivo (Collaborator) left a comment:

This seems to be correct, but I cannot comment on the speedup gained by this.

codecov bot commented Mar 20, 2020

Codecov Report

Merging #798 into develop will decrease coverage by 0.01%.
The diff coverage is 0.00%.

@@             Coverage Diff             @@
##           develop     #798      +/-   ##
===========================================
- Coverage    69.54%   69.53%   -0.02%     
===========================================
  Files           70       70              
  Lines         6547     6548       +1     
===========================================
  Hits          4553     4553              
- Misses        1994     1995       +1     

@pierreluctg pierreluctg merged commit ffa1800 into develop Mar 23, 2020
@pierreluctg pierreluctg deleted the neovi-msg.data-cache branch March 23, 2020 20:11
@pierreluctg pierreluctg mentioned this pull request Apr 20, 2020