One big takeaway is that `connection` should not be bound to `Blob` and `Bucket` (#728).
If the `Bucket` does not explicitly have the `batch` set as its connection, the request happens outside the batch:
```python
>>> from gcloud import storage
>>> from gcloud.storage.batch import Batch
>>> storage._PROJECT_ENV_VAR_NAME = 'GCLOUD_TESTS_PROJECT_ID'
>>> storage.set_defaults()
>>> bucket_name = 'dsmlmsldfsacjnajdnkewee'
>>> connection = storage.get_default_connection()
>>> bucket = storage.Bucket(name=bucket_name, connection=connection)
>>> blob = storage.Blob('foo', bucket=bucket)
>>> blob._reload_properties()
<Blob: dsmlmsldfsacjnajdnkewee, foo>
>>> blob.content_type = 'foo/bar'
>>>
>>> with Batch() as batch:
...     blob.patch()
...
<Blob: dsmlmsldfsacjnajdnkewee, foo>
Traceback (most recent call last):
  File "<stdin>", line 2, in <module>
  File "gcloud/storage/batch.py", line 165, in __exit__
    self.finish()
  File "gcloud/storage/batch.py", line 130, in finish
    raise ValueError("No deferred requests")
ValueError: No deferred requests
```
On the other hand, if we do the setup correctly, it puts the `blob` in an undefined state (I referenced this some time ago):
```python
>>> from gcloud import storage
>>> from gcloud.storage.batch import Batch
>>> storage._PROJECT_ENV_VAR_NAME = 'GCLOUD_TESTS_PROJECT_ID'
>>> storage.set_defaults()
>>>
>>> with Batch() as batch:
...     bucket_name = 'dsmlmsldfsacjnajdnkewee'
...     bucket = storage.Bucket(name=bucket_name, connection=batch)
...     blob = storage.Blob('foo', bucket=bucket)
...     blob.content_type = 'foo/bar'
...     blob.patch()
...
<Blob: dsmlmsldfsacjnajdnkewee, foo>
>>> blob._properties
''
>>> blob.content_type
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "gcloud/storage/_helpers.py", line 163, in _getter
    return self.properties[fieldname]
  File "gcloud/storage/_helpers.py", line 64, in properties
    return self._properties.copy()
AttributeError: 'str' object has no attribute 'copy'
```
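To make the failure mode concrete without a live gcloud install, here is a minimal self-contained sketch (toy classes, not the real gcloud API) of the pattern: because the connection is bound to the object, a `Batch` standing in as the connection defers the request and hands back an empty placeholder, and that placeholder then becomes the blob's state.

```python
class Connection:
    """Immediate execution: returns a real response dict."""

    def api_request(self, method, path, data=None):
        return {'name': path.rsplit('/', 1)[-1], **(data or {})}


class Batch(Connection):
    """Defers requests and returns an empty placeholder, like the issue describes."""

    def __init__(self):
        self._requests = []

    def api_request(self, method, path, data=None):
        self._requests.append((method, path, data))
        return ''  # placeholder response -- the source of the undefined state

    def finish(self):
        if not self._requests:
            raise ValueError("No deferred requests")
        return [(method, path) for method, path, _ in self._requests]


class Blob:
    def __init__(self, name, connection):
        self.name = name
        self.connection = connection  # bound at construction: the design flaw
        self._properties = {}

    def patch(self):
        # Whatever the connection returns becomes this object's state.
        self._properties = self.connection.api_request(
            'PATCH', '/b/bucket/o/%s' % self.name,
            data={'contentType': 'foo/bar'})


blob = Blob('foo', Connection())
blob.patch()
print(blob._properties['contentType'])  # real dict: foo/bar

batch = Batch()
deferred_blob = Blob('foo', batch)
deferred_blob.patch()
print(repr(deferred_blob._properties))  # '' -- undefined state, as in the traceback above
```

The second `print` mirrors the `blob._properties` line in the transcript: once the batch's placeholder replaces the properties dict, any later attribute access blows up on the string.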
This was fixed in #812