Implementing storage regression tests to match gcloud-node. #319
Merged
```diff
@@ -1,3 +1,4 @@
-export GCLOUD_TESTS_DATASET_ID="my-dataset"
+export GCLOUD_TESTS_PROJECT_ID="my-project"
+export GCLOUD_TESTS_DATASET_ID=${GCLOUD_TESTS_PROJECT_ID}
 export GCLOUD_TESTS_CLIENT_EMAIL="[email protected]"
 export GCLOUD_TESTS_KEY_FILE="path.key"
```
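The suite presumably consumes these settings via `os.environ` (likely inside `regression_utils`); a minimal sketch, with a hypothetical helper name not taken from the PR:

```python
import os


def require_env(name):
    # Hypothetical helper: fetch a required regression-test setting,
    # failing loudly when the variable is unset rather than letting a
    # test fail later with a confusing auth error.
    value = os.environ.get(name)
    if value is None:
        raise EnvironmentError('Missing environment variable: %s' % name)
    return value
```

For example, `require_env('GCLOUD_TESTS_PROJECT_ID')` would return `"my-project"` under the exports above.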
@@ -0,0 +1,247 @@

```python
from Crypto.Hash import MD5
import base64
import httplib2
import tempfile
import time
import unittest2

from gcloud import storage
# This assumes the command is being run via tox hence the
# repository root is the current directory.
from regression import regression_utils


HTTP = httplib2.Http()
SHARED_BUCKETS = {}


def setUpModule():
    if 'test_bucket' not in SHARED_BUCKETS:
        connection = regression_utils.get_storage_connection()
        # %d rounds milliseconds to nearest integer.
        bucket_name = 'new%d' % (1000 * time.time(),)
        # In the **very** rare case the bucket name is reserved, this
        # fails with a ConnectionError.
        SHARED_BUCKETS['test_bucket'] = connection.create_bucket(bucket_name)


def tearDownModule():
    for bucket in SHARED_BUCKETS.values():
        # Passing force=True also deletes all files.
        bucket.delete(force=True)
```
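The bucket-naming scheme in `setUpModule` can be isolated as a small pure function; this is a sketch of the same millisecond-timestamp idea, not a helper the PR defines:

```python
import time


def unique_bucket_name(prefix='new'):
    # Same scheme as setUpModule: append the current time in
    # milliseconds so concurrent test runs are unlikely to collide.
    # Note: %d truncates the float to an integer.
    return '%s%d' % (prefix, 1000 * time.time())
```

Two calls made in the same millisecond would still collide, which is why the module comments call the reserved-name failure "**very** rare" rather than impossible.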
```python
class TestStorage(unittest2.TestCase):

    @classmethod
    def setUpClass(cls):
        cls.connection = regression_utils.get_storage_connection()


class TestStorageBuckets(TestStorage):

    def setUp(self):
        self.case_buckets_to_delete = []

    def tearDown(self):
        for bucket in self.case_buckets_to_delete:
            bucket.delete()

    def test_create_bucket(self):
        new_bucket_name = 'a-new-bucket'
        self.assertRaises(storage.exceptions.NotFoundError,
                          self.connection.get_bucket, new_bucket_name)
        created = self.connection.create_bucket(new_bucket_name)
        self.case_buckets_to_delete.append(created)
        self.assertEqual(created.name, new_bucket_name)

    def test_get_buckets(self):
        buckets_to_create = [
            'new%d' % (1000 * time.time(),),
            'newer%d' % (1000 * time.time(),),
            'newest%d' % (1000 * time.time(),),
        ]
        created_buckets = []
        for bucket_name in buckets_to_create:
            bucket = self.connection.create_bucket(bucket_name)
            self.case_buckets_to_delete.append(bucket)

        # Retrieve the buckets.
        all_buckets = self.connection.get_all_buckets()
        created_buckets = [bucket for bucket in all_buckets
                           if bucket.name in buckets_to_create]
        self.assertEqual(len(created_buckets), len(buckets_to_create))
```
```python
class TestStorageFiles(TestStorage):

    FILES = {
        'logo': {
            'path': 'regression/data/CloudPlatform_128px_Retina.png',
        },
        'big': {
            'path': 'regression/data/five-mb-file.zip',
        },
    }

    @staticmethod
    def _get_base64_md5hash(filename):
        with open(filename, 'rb') as file_obj:
            hash = MD5.new(data=file_obj.read())
        digest_bytes = hash.digest()
        return base64.b64encode(digest_bytes)

    @classmethod
    def setUpClass(cls):
        super(TestStorageFiles, cls).setUpClass()
        for file_data in cls.FILES.values():
            file_data['hash'] = cls._get_base64_md5hash(file_data['path'])
        cls.bucket = SHARED_BUCKETS['test_bucket']

    def setUp(self):
        self.case_keys_to_delete = []

    def tearDown(self):
        for key in self.case_keys_to_delete:
            key.delete()
```
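`_get_base64_md5hash` relies on PyCrypto's `MD5`; the standard-library `hashlib` computes the identical digest, so an equivalent without the extra dependency would look like this (sketch, not part of the PR):

```python
import base64
import hashlib


def base64_md5hash(filename):
    # Equivalent of _get_base64_md5hash using only the stdlib: GCS
    # reports md5Hash as the base64 encoding of the raw (binary)
    # MD5 digest, not of the hex string.
    with open(filename, 'rb') as file_obj:
        digest = hashlib.md5(file_obj.read()).digest()
    return base64.b64encode(digest)
```

Using `.digest()` rather than `.hexdigest()` matters here: base64-encoding the hex string would produce a value that never matches the `md5Hash` metadata field.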
```python
class TestStorageWriteFiles(TestStorageFiles):

    def test_large_file_write_from_stream(self):
        key = self.bucket.new_key('LargeFile')
        self.assertEqual(key.metadata, {})

        file_data = self.FILES['big']
        with open(file_data['path'], 'rb') as file_obj:
            self.bucket.upload_file_object(file_obj, key=key)
            self.case_keys_to_delete.append(key)

        key.reload_metadata()
        self.assertEqual(key.metadata['md5Hash'], file_data['hash'])

    def test_write_metadata(self):
        my_metadata = {'contentType': 'image/png'}
        key = self.bucket.upload_file(self.FILES['logo']['path'])
        self.case_keys_to_delete.append(key)

        # NOTE: This should not be necessary. We should be able to pass
        # it in to upload_file and also to upload_from_string.
        key.patch_metadata(my_metadata)
        self.assertEqual(key.metadata['contentType'],
                         my_metadata['contentType'])

    def test_direct_write_and_read_into_file(self):
        key = self.bucket.new_key('MyBuffer')
        file_contents = 'Hello World'
        key.upload_from_string(file_contents)
        self.case_keys_to_delete.append(key)

        same_key = self.bucket.new_key('MyBuffer')
        temp_filename = tempfile.mktemp()
        with open(temp_filename, 'w') as file_obj:
            same_key.get_contents_to_file(file_obj)

        with open(temp_filename, 'rb') as file_obj:
            stored_contents = file_obj.read()

        self.assertEqual(file_contents, stored_contents)

    def test_copy_existing_file(self):
        key = self.bucket.upload_file(self.FILES['logo']['path'],
                                      key='CloudLogo')
        self.case_keys_to_delete.append(key)

        new_key = self.bucket.copy_key(key, self.bucket, 'CloudLogoCopy')
        self.case_keys_to_delete.append(new_key)

        base_contents = key.get_contents_as_string()
        copied_contents = new_key.get_contents_as_string()
        self.assertEqual(base_contents, copied_contents)
```
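`test_direct_write_and_read_into_file` uses `tempfile.mktemp()`, which is race-prone (the name may be claimed between creation and use). A safer shape for the same write-then-read-back step, sketched without the storage calls:

```python
import tempfile


def roundtrip_via_tempfile(contents):
    # Write contents to a real temporary file and read them back,
    # mirroring the download-then-compare step of the test above but
    # via NamedTemporaryFile, which creates the file atomically and
    # cleans it up when the context exits.
    with tempfile.NamedTemporaryFile(mode='w+') as file_obj:
        file_obj.write(contents)
        file_obj.flush()
        file_obj.seek(0)
        return file_obj.read()
```

In the real test, `get_contents_to_file` would replace the `write` call; the flush/seek discipline is the same either way.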
```python
class TestStorageListFiles(TestStorageFiles):

    FILENAMES = ['CloudLogo1', 'CloudLogo2', 'CloudLogo3']

    @classmethod
    def setUpClass(cls):
        super(TestStorageListFiles, cls).setUpClass()
        # Make sure bucket empty before beginning.
        for key in cls.bucket:
            key.delete()

        logo_path = cls.FILES['logo']['path']
        key = cls.bucket.upload_file(logo_path, key=cls.FILENAMES[0])
        cls.suite_keys_to_delete = [key]

        # Copy main key onto remaining in FILENAMES.
        for filename in cls.FILENAMES[1:]:
            new_key = cls.bucket.copy_key(key, cls.bucket, filename)
            cls.suite_keys_to_delete.append(new_key)

    @classmethod
    def tearDownClass(cls):
        for key in cls.suite_keys_to_delete:
            key.delete()

    def test_list_files(self):
        all_keys = self.bucket.get_all_keys()
        self.assertEqual(len(all_keys), len(self.FILENAMES))

    def test_paginate_files(self):
        truncation_size = 1
        extra_params = {'maxResults': len(self.FILENAMES) - truncation_size}
        iterator = storage.key._KeyIterator(bucket=self.bucket,
                                            extra_params=extra_params)
        response = iterator.get_next_page_response()
        keys = list(iterator.get_items_from_response(response))
        self.assertEqual(len(keys), extra_params['maxResults'])
        self.assertEqual(iterator.page_number, 1)
        self.assertTrue(iterator.next_page_token is not None)

        response = iterator.get_next_page_response()
        last_keys = list(iterator.get_items_from_response(response))
        self.assertEqual(len(last_keys), truncation_size)
```
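The arithmetic `test_paginate_files` relies on is simple: with 3 objects and `maxResults=2`, the first page holds 2 keys and the second holds the 1 truncated key. A client-side model of that paging, independent of the `_KeyIterator` internals:

```python
def paginate(items, max_results):
    # Model of maxResults paging: yield successive pages of at most
    # max_results items; the final page carries the remainder
    # (the "truncation_size" the test asserts on).
    for start in range(0, len(items), max_results):
        yield items[start:start + max_results]
```

For `paginate(['CloudLogo1', 'CloudLogo2', 'CloudLogo3'], 2)` this yields a page of two names followed by a page of one, matching the two assertions in the test.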
```python
class TestStorageSignURLs(TestStorageFiles):

    def setUp(self):
        super(TestStorageSignURLs, self).setUp()

        logo_path = self.FILES['logo']['path']
        with open(logo_path, 'r') as file_obj:
            self.LOCAL_FILE = file_obj.read()

        key = self.bucket.new_key('LogoToSign.jpg')
        key.upload_from_string(self.LOCAL_FILE)
        self.case_keys_to_delete.append(key)

    def tearDown(self):
        for key in self.case_keys_to_delete:
            if key.exists():
                key.delete()

    def test_create_signed_read_url(self):
        key = self.bucket.new_key('LogoToSign.jpg')
        expiration = int(time.time() + 5)
        signed_url = key.generate_signed_url(expiration, method='GET')

        response, content = HTTP.request(signed_url, method='GET')
        self.assertEqual(response.status, 200)
        self.assertEqual(content, self.LOCAL_FILE)

    def test_create_signed_delete_url(self):
        key = self.bucket.new_key('LogoToSign.jpg')
        expiration = int(time.time() + 283473274)
        signed_delete_url = key.generate_signed_url(expiration,
                                                    method='DELETE')

        response, content = HTTP.request(signed_delete_url, method='DELETE')
        self.assertEqual(response.status, 204)
        self.assertEqual(content, '')

        # Check that the key has actually been deleted.
        self.assertRaises(storage.exceptions.NotFoundError,
                          key.reload_metadata)
```
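Both signed-URL tests build the expiration the same way: an absolute POSIX timestamp, computed by adding an offset in seconds to the current time. Factored out as a sketch (not a helper the PR defines):

```python
import time


def seconds_from_now(seconds):
    # Signed-URL expirations here are absolute epoch timestamps: the
    # read test uses a 5-second window, the delete test a far-future
    # offset, but both are just time.time() plus an offset, truncated
    # to an int.
    return int(time.time() + seconds)
```

A too-small window risks flaky tests if the HTTP request is slow, which is presumably why the delete test picks a generously large offset.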