#15: Add storage.batch.Batch
#654
Conversation
Changes Unknown when pulling 5fed4d2 on tseaver:15-add_storage_batch into GoogleCloudPlatform:master.

@craigcitro, @dhermes I'm willing to do the work to vendor in and use the apitools

Should I review this or rely on the vendored-in code?

@dhermes we don't have a vendored-in version of that module yet. My earlier question was whether we should go ahead and vendor that code in, given that it has no coverage and does not straddle Py3k yet. The amount of code in the new
# Flatten payload
if six.PY3:  # pragma: NO COVER Python3
    buf = io.StringIO()
if body is None:
    body = ''
headers['Content-Length'] = len(body)
lines = ['%s %s HTTP/1.1' % (method, uri)]
A batch proxies a connection, deferring write requests.
In preparation for making 'storage.batch.Batch' derive from Connection, allowing it to override only the actual transmission of the HTTP request.
Drop patching the connection's 'http', as well as proxying its attributes (we get that via subclassing).
@dhermes PTAL. Rebased on top of master, squashing all previous commits; then implemented subclassing per our discussion today.
lines.append('')
lines.append(body)
payload = '\r\n'.join(lines)
if sys.version_info[0] < 3:  # pragma: NO COVER Python2
One quick note: @dhermes was asking about batch+upload. At least according to the docs, this is not supported (last line in "overview").
and ``content`` (a string).
:returns: The HTTP response object and the content of the response.
"""
if method == 'GET':
The batching interface is not specific to the storage API.
:raises: ValueError if no requests have been deferred.
"""
if len(self._requests) == 0:
    raise ValueError("No deferred requests")
RE: #654 (comment) @craigcitro can we help get
@@ -52,15 +53,16 @@ def setUp(self):
        self.case_buckets_to_delete = []

    def tearDown(self):
        for bucket in self.case_buckets_to_delete:
            bucket.delete()
        with Batch(CONNECTION) as batch:
Replying to myself for posterity: I believe neither uploads nor downloads work in a batch.
@dhermes yes! so here's the current apitools plan:
The batching interface is not specific to the storage API.
It does not case-normalize header key lookup, but stores headers only as lowercase.
For posterity (RE: Craig's comment):
I think everything looks good except removing that unused LGTM
@craigcitro Can we start a new issue to discuss
@dhermes yeah, new issue wherever sgtm. maybe a "py3 support" issue in each of protorpc/apitools, and an issue here for "better apitools dependency" blocked by those two?
#15: Add `storage.batch.Batch`
Step #2 in #15 (comment)