Add resource leak hero scenario. #455
Conversation
Some minimal adjustments, but it looks great!
Co-authored-by: Juliano Costa <julianocosta89@outlook.com>
Yeah -- incidentally, I'm not wedded to the growth rate here. It's probably too quick: the cache blows up in about 5 minutes at the current random rate plus the ^2 growth of the list. I might tweak it to ^1.5 or ^1.2 to get more of a stairstep in growth.
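The growth rates being discussed can be sketched quickly. A minimal illustration, assuming a step function like the one below (the function name and exact stepping rule are illustrative, not taken from the PR's code):

```python
def grow(initial, exponent, steps):
    """Raise the cache size to `exponent` each step (floored, with a
    minimum increment of 1 so the size always advances)."""
    size = initial
    sizes = [size]
    for _ in range(steps):
        size = max(size + 1, int(size ** exponent))
        sizes.append(size)
    return sizes

# ^2 blows up almost immediately, while ^1.5 climbs in smaller steps,
# producing more of a stairstep before a memory limit would be hit.
print(grow(2, 2.0, 3))  # [2, 4, 16, 256]
print(grow(2, 1.5, 3))  # [2, 3, 5, 11]
```

With ^2 the size squares on every step, so the leak saturates memory in a handful of iterations; the lower exponents stretch the same blow-up over many more requests, which reads as a gradual stairstep on a metrics dashboard.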
Please add app.cache_size to the list of manual span attributes.
All looking good!
Looks great, works fine!
Commits:
* add cache leak failure scenario
* add attribute to span
* update changelog
* Update src/recommendationservice/recommendation_server.py (Co-authored-by: Juliano Costa <julianocosta89@outlook.com>)
* add newline
* add docs
* tweak scenario
* add resource limit to force service overlimit
* review

Co-authored-by: Juliano Costa <julianocosta89@outlook.com>
Changes
Adds a 'hero scenario' of a resource leak to the Recommendation Service. This scenario implements a naive cache, controlled by the recommendationCache flag, as well as a new app.cache_size span attribute. Other changes include adding a restart policy to the container for this service, as the cache will blow up after a few minutes.

For significant contributions please make sure you have completed the following items:
* CHANGELOG.md updated for non-trivial changes
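The naive-cache scenario described above can be sketched as follows. This is a hedged illustration, not the PR's actual recommendation_server.py: the function name and arguments are invented, and the span attribute is simulated with a plain dict, where the real service would call span.set_attribute("app.cache_size", ...) through the OpenTelemetry API.

```python
# Unbounded module-level cache: nothing is ever evicted, so memory use
# grows with every request until the container's resource limit kills it
# (hence the restart policy added in this PR).
cache = []

def get_recommendations(product_ids, cache_enabled, span_attributes):
    """Return up to 5 recommendations; leak ids into the cache when the
    feature flag (recommendationCache in the PR) is enabled."""
    if cache_enabled:
        cache.extend(product_ids)  # the leak: append on every call
        span_attributes["app.cache_size"] = len(cache)
    return product_ids[:5]

attrs = {}
get_recommendations(["a", "b", "c"], True, attrs)
get_recommendations(["d", "e"], True, attrs)
print(attrs["app.cache_size"])  # 5 -- grows monotonically across requests
```

Recording the cache size as a span attribute is what makes this a usable demo scenario: an operator investigating the OOM restarts can correlate rising app.cache_size values on spans with the service's memory growth.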