test: prevent alchemy test suite from being rate-limited. #1009
Conversation
The diff under review adds this docstring to the delay helper:

"""
Adds a delay between operations to prevent exceeding the "Maximum rate of table metadata update operations per table".

The rate limit is set to a maximum of 5 operations per 10 seconds. By introducing a sleep of 2 seconds before each operation, this function helps to ensure that the rate limit is not exceeded.
"""
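A minimal sketch of the delay-based workaround the docstring describes (the helper name and its call shape are illustrative, not taken from the PR):

```python
import time

# Quota from https://cloud.google.com/bigquery/quotas#standard_tables:
# at most 5 table metadata update operations per table per 10 seconds.
DELAY_SECONDS = 2  # sleeping 2 s between operations stays under 5 ops / 10 s


def throttle_metadata_update(operation, *args, **kwargs):
    """Hypothetical helper: sleep before each table metadata update so the
    test suite never exceeds the per-table rate limit."""
    time.sleep(DELAY_SECONDS)
    return operation(*args, **kwargs)
```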
I'm somewhat surprised that our exponential backoff query job retry logic doesn't handle this already. Do we have a link to a failing test stacktrace?
Instead of sleep, I'd like for us to add the rate limit as an allowed job retry reason, if we know that the error is safe to retry.
If we do go this route, let's add a link to https://cloud.google.com/bigquery/quotas#standard_tables where this limit is defined.
Sure, it would be better handled in the python-bigquery code. The workaround I suggested is quite a quick-and-dirty solution.
Here's a stacktrace of the error: https://source.cloud.google.com/results/invocations/ffafb866-6bc0-423f-a86b-df69fb270d57/log.
I've filed googleapis/python-bigquery#1790 as the preferred approach to fixing this problem.
Great! Thanks for taking the time to describe the issue.
Let's see if we close this one or want to keep a workaround in the meantime.
I am inclined to see if we can get this fixed.
Will work through things on the Issue Tim created...googleapis/python-bigquery#1790
This is now obsolete, as @kiraksi has resolved the underlying retry strategy issue in python-bigquery.
Checking the error message in the Kokoro Prerelease Dependencies check in #936, it appears that the alembic system test suite is being rate-limited after exceeding the DDL operations quota (https://cloud.google.com/bigquery/quotas#standard_tables).
I suggest we could add a delay to prevent the tests from being rate-limited. This is not very elegant, but it should not have too much impact on the overall duration of the test suite.
We could also introduce a more relevant retry strategy, but that would require some more work.
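A rate-limit-aware alternative to a fixed sleep would be to track operations against the documented quota directly. The following sliding-window limiter is a sketch, not code from the PR; the class name and the injectable clock/sleep parameters are assumptions made for testability.

```python
import time
from collections import deque


class MetadataUpdateLimiter:
    """Illustrative sliding-window limiter matching the documented quota of
    5 table metadata update operations per table per 10 seconds."""

    def __init__(self, max_ops=5, window_seconds=10.0,
                 clock=time.monotonic, sleep=time.sleep):
        self.max_ops = max_ops
        self.window = window_seconds
        self.clock = clock
        self.sleep = sleep
        self.timestamps = deque()  # start times of recent operations

    def acquire(self):
        """Block only when the next operation would exceed the quota."""
        now = self.clock()
        # Drop timestamps that have left the window.
        while self.timestamps and now - self.timestamps[0] >= self.window:
            self.timestamps.popleft()
        if len(self.timestamps) >= self.max_ops:
            # Wait until the oldest operation falls out of the window.
            self.sleep(self.window - (now - self.timestamps[0]))
            now = self.clock()
            while self.timestamps and now - self.timestamps[0] >= self.window:
                self.timestamps.popleft()
        self.timestamps.append(self.clock())
```

Unlike the flat 2-second sleep, this only pauses when the test suite is actually about to exceed the quota, so short runs stay fast.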