
#15: Add storage.batch.Batch #654

Merged: 9 commits from tseaver:15-add_storage_batch into GoogleCloudPlatform:master, Feb 26, 2015

Conversation

@tseaver (Contributor) commented Feb 16, 2015

Step #2 in #15 (comment)

@googlebot googlebot added the cla: yes This human has signed the Contributor License Agreement. label Feb 16, 2015
@coveralls (Coverage Status): Changes unknown when pulling 5fed4d2 on tseaver:15-add_storage_batch into GoogleCloudPlatform:master.

@tseaver (Contributor, Author) commented Feb 16, 2015

@craigcitro, @dhermes I'm willing to do the work to vendor in and use the apitools batch module, if Craig doesn't think it will have a PyPI release soon. I can also work on getting it up-to-speed in terms of test coverage and Py3k straddling in the canonical repository, if that would help.

@dhermes (Contributor) commented Feb 17, 2015

Should I review this, or rely on the vendored-in code?

@tseaver (Contributor, Author) commented Feb 17, 2015

@dhermes we don't have a vendored-in version of that module yet. My earlier question was whether we should go ahead and vendor that code in, given that it has no coverage and does not straddle Py3k yet.

The amount of code in the new storage/batch.py module which would be replaced if we chose to vendor the apitools module is relatively small (135 out of 544 lines):


# Flatten payload
if six.PY3:  # pragma: NO COVER Python3
    buf = io.StringIO()


@coveralls (Coverage Status): Coverage remained the same at 100.0% when pulling ae0e9cf on tseaver:15-add_storage_batch into 8e68a20 on GoogleCloudPlatform:master.

@dhermes dhermes added the api: storage Issues related to the Cloud Storage API. label Feb 18, 2015
@coveralls (Coverage Status): Coverage remained the same at 100.0% when pulling 6261272 on tseaver:15-add_storage_batch into 8e68a20 on GoogleCloudPlatform:master.

if body is None:
    body = ''
headers['Content-Length'] = len(body)
lines = ['%s %s HTTP/1.1' % (method, uri)]


@coveralls (Coverage Status): Coverage remained the same at 100.0% when pulling af2a382 on tseaver:15-add_storage_batch into 8e68a20 on GoogleCloudPlatform:master.

Commit messages:

  • A batch proxies a connection, deferring write requests.
  • In preparation for making 'storage.batch.Batch' derive from Connection, allowing it to override only the actual transmission of the HTTP request. Drop patching the connection's 'http', as well as proxying its attributes (we get that via subclassing).
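The commit messages above capture the design: Batch derives from Connection, inheriting its attributes, and overrides only the actual transmission so requests queue up instead of going out. A minimal sketch of that shape (`Connection` and the `_do_request` hook are illustrative stand-ins for the real classes, not the library's exact API):

```python
class Connection(object):
    """Stand-in for the real Connection; only _do_request matters here."""

    def _do_request(self, method, uri, headers, body):
        raise NotImplementedError('would transmit over HTTP')


class Batch(Connection):
    """Defer write requests instead of transmitting them.

    Because Batch subclasses Connection, callers can pass it anywhere
    a connection is expected; only transmission is overridden.
    """

    def __init__(self):
        super(Batch, self).__init__()
        self._requests = []

    def _do_request(self, method, uri, headers, body):
        # Queue the request; a canned "accepted" response keeps callers happy.
        self._requests.append((method, uri, headers, body))
        return {'status': '204'}, ''

    def finish(self):
        # Assemble the queued requests; real code would POST one
        # multipart/mixed body to the API's /batch endpoint here.
        if len(self._requests) == 0:
            raise ValueError("No deferred requests")
        return list(self._requests)
```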
@tseaver (Contributor, Author) commented Feb 25, 2015

@dhermes PTAL.

Rebased on top of master, squashing all previous commits; then implemented subclassing per our discussion today.

lines.append('')
lines.append(body)
payload = '\r\n'.join(lines)
if sys.version_info[0] < 3: # pragma: NO COVER Python2


@craigcitro (Contributor):

one quick note: @dhermes was asking about batch+upload. at least according to the docs, this is not supported:
https://cloud.google.com/storage/docs/json_api/v1/how-tos/batch

(last line in "overview")

and ``content`` (a string).
:returns: The HTTP response object and the content of the response.
"""
if method == 'GET':


The batching interface is not specific to the storage API.
:raises: ValueError if no requests have been deferred.
"""
if len(self._requests) == 0:
    raise ValueError("No deferred requests")


@dhermes (Contributor) commented Feb 26, 2015

RE: #654 (comment)

@craigcitro can we help get apitools:

  • In PyPI
  • Supporting Py3
  • Merging upstream changes from places like gsutil

@@ -52,15 +53,16 @@ def setUp(self):
        self.case_buckets_to_delete = []

    def tearDown(self):
        for bucket in self.case_buckets_to_delete:
            bucket.delete()
        with Batch(CONNECTION) as batch:
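The tearDown change above funnels all bucket deletions through a single batch by using it as a context manager. A toy sketch of that protocol (`FakeConnection`, `Batch`, and `defer` are stand-ins invented here; only the with-statement flow is shown):

```python
class FakeConnection(object):
    """Records what the batch 'transmits' so the flow is observable."""

    def __init__(self):
        self.sent = None


class Batch(object):
    # Toy stand-in for storage.batch.Batch: queue requests inside the
    # with-block, then transmit them all at once on a clean exit.
    def __init__(self, connection):
        self._connection = connection
        self._requests = []

    def defer(self, method, path):
        self._requests.append((method, path))

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        if exc_type is None and self._requests:
            # One combined round trip instead of N separate deletes.
            self._connection.sent = list(self._requests)
        return False  # never swallow exceptions from the block


CONNECTION = FakeConnection()
with Batch(CONNECTION) as batch:
    for name in ['bucket-1', 'bucket-2']:
        batch.defer('DELETE', '/b/%s' % name)
```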


@craigcitro (Contributor):

replying to myself for posterity: i believe neither uploads nor downloads work in a batch.

@craigcitro (Contributor):

@dhermes yes! so here's the current apitools plan:

  • i'm working on merging the gsutil fork today
  • once i do that, i'm going to submit to pypi
  • after that, help getting it py3 ready would be AWESOME
  • even right now, py3 support for protorpc would be super

Commit messages:

  • The batching interface is not specific to the storage API.
  • It does not case-normalize header key lookup, but stores headers only as lowercase.
@dhermes (Contributor) commented Feb 26, 2015

For posterity (RE: Craig's comment):

Note: Currently, Google Cloud Storage does not support batch operations for
media, either for upload or download.


@coveralls (Coverage Status): Coverage remained the same at 100.0% when pulling de6fd48 on tseaver:15-add_storage_batch into 808c38b on GoogleCloudPlatform:master.

@dhermes (Contributor) commented Feb 26, 2015

I think everything looks good except removing that unused sys import.

LGTM

@dhermes (Contributor) commented Feb 26, 2015

@craigcitro Can we start a new issue to discuss apitools / protorpc? Should the issue be here or in one (or both) of those repos?

@craigcitro (Contributor):

@dhermes yeah, new issue wherever sgtm. maybe a "py3 support" issue in each of protorpc/apitools, and an issue here for "better apitools dependency" blocked by those two?

@coveralls (Coverage Status): Coverage remained the same at 100.0% when pulling c828bda on tseaver:15-add_storage_batch into d43c4a0 on GoogleCloudPlatform:master.

tseaver added a commit that referenced this pull request Feb 26, 2015
@tseaver tseaver merged commit f0a1157 into googleapis:master Feb 26, 2015
@tseaver tseaver deleted the 15-add_storage_batch branch February 26, 2015 19:10
@dhermes (Contributor) commented Apr 9, 2015

@tseaver We discussed using the apitools batch support when this went in. We should discuss if / how that's possible (in light of #811).

Labels: api: storage (Issues related to the Cloud Storage API), cla: yes (This human has signed the Contributor License Agreement)

5 participants