AWS S3 Frankfurt region not working #28
s3 boto users can find a temporary workaround here: boto/boto#2741

EDIT: here is a sample workaround:

```python
import os

from storages.backends.s3boto import S3BotoStorage

os.environ['S3_USE_SIGV4'] = 'True'


class S3Storage(S3BotoStorage):

    @property
    def connection(self):
        if self._connection is None:
            self._connection = self.connection_class(
                self.access_key, self.secret_key,
                calling_format=self.calling_format,
                host='s3.eu-central-1.amazonaws.com')
        return self._connection
```
I can confirm that the above workaround works for Frankfurt. Big thanks @FPurchess!
I can confirm this, too! Thank you!!
You can just add this to the settings:
(Note: I'm using the bitbucket repo)
@BodhiSukha thanks for that, going to close now. If you notice any regression in moving from the BitBucket repo to this one please open an issue.
@BodhiSukha
@satyrius sorry, just saw your comment. I'm using it in 2 different setups
and
it works fine on both of them. Unfortunately I'm not sure about your problem. The only other thing different in my settings is the calling format. In a Python 2.7.9 environment I use this extra setting:
When I try using this workaround, I get
Funny, this has started to throw 403 Forbidden all of a sudden, but only when requesting from localhost. Might have something to do with HTTP vs. HTTPS? EDIT:
@miracle2k Make sure you're providing the modified storage class in your settings. This works for me:

```python
import os

from storages.backends.s3boto import S3BotoStorage

os.environ['S3_USE_SIGV4'] = 'True'


class S3Storage(S3BotoStorage):

    @property
    def connection(self):
        if self._connection is None:
            self._connection = self.connection_class(
                self.access_key, self.secret_key,
                calling_format=self.calling_format,
                host='s3.eu-central-1.amazonaws.com')
        return self._connection


STATICFILES_STORAGE = 'MyProject.settings.S3Storage'
```
I have checked the issues about Signature Version 4 with boto for the Frankfurt region. I have had the same problem with Seoul. I know that both the Seoul and Frankfurt regions support Signature Version 4 only.
@somacci the solution by @FPurchess works. Try it.
Thanks @Leistungsabfall!
Could we re-open this issue, please? It's not solved; we just have a workaround.
I had the same issue and fixed it by using S3Boto3Storage instead of S3BotoStorage, plus two undocumented parameters in my settings.py. requirements.txt:
settings.py:
To be honest, I didn't set the STATICFILES_STORAGE setting, because my application uses the S3Boto3Storage class for specific tasks, not for all static files. But I think it should work this way. I stumbled upon the undocumented options by going through the source code of django-storages. Hope this will be a solution for others too.
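For reference, a minimal sketch of what such boto3-backend settings might look like. The setting names `AWS_S3_REGION_NAME` and `AWS_S3_SIGNATURE_VERSION` are my assumption of the "two undocumented parameters" being referred to, and the bucket name is a placeholder:

```python
# settings.py (sketch, assuming the S3Boto3Storage backend)
AWS_STORAGE_BUCKET_NAME = 'my-bucket'  # placeholder bucket name
AWS_S3_REGION_NAME = 'eu-central-1'    # Frankfurt
AWS_S3_SIGNATURE_VERSION = 's3v4'      # force Signature Version 4

DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
```

With the boto3 backend, region and signature version are plain settings; no storage subclass is needed.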
@sdeleeuw's workaround works beautifully.
In my case, for the Frankfurt region, it was enough to add only these two lines to settings (no need to override S3BotoStorage). Tested with
I experienced an issue with using the London region (
I'm going to reopen this until I can get it added to the docs.
I used the method by @sdeleeuw, and at least it does not throw any errors anymore. However, when I upload a file, it does not create the file in S3. I created an IAM user and am using its credentials. Also, I added the user as a principal:
And CORS:
Any ideas?
So I solved my problem by switching to:
Not sure what the problem was, but I am using eu-central-1. Maybe that is the issue, but anyway, it seems to be fixed.
@oesah I think that s3boto3 handles this automagically, but the same doesn't play out for s3boto. I have opened #335, which includes documentation for the fixes discovered in this thread and changes the default signature version, so no one will be bitten by this in the future. Thanks for your patience and your discoveries, everyone. If you see anything wrong in #335, please say something. I'm going to merge it in the morning and immediately cut 1.6 afterwards.
I'm getting:
with this code:
Can anyone see the issue here? I'm trying for London; I got the S3 host from here.
EDIT: I updated boto, but it went to a different Python environment; after updating the correct boto, my code above works.
This also appears to happen in the 'Canada (Central)' region. Was getting a
Edit: the workaround worked great for the Canada region.
Adding
Just my 2 cents here; I might be missing something, but it seems like boto reads
Can someone confirm this? If this is true, the docs need to be updated.
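If that reading is right, the ordering matters: the variable has to be present in the environment before the S3 connection is created. A small stdlib-only sketch (`S3_USE_SIGV4` is the variable name used by the workarounds earlier in this thread):

```python
import os

# boto consults the S3_USE_SIGV4 environment variable when the S3
# connection object is created, so export it early (at the top of
# settings.py, or in the shell environment) rather than after the
# storage backend has already been instantiated.
os.environ['S3_USE_SIGV4'] = 'True'

use_sigv4 = os.environ['S3_USE_SIGV4'] == 'True'
```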
Setting the region name to one of the following also does the job, because of this:
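Pulling together the regions mentioned in this thread, here is a small helper sketch mapping SigV4-only regions to the region-specific hosts that the workarounds pass to boto. The list covers only the regions reported above and is not exhaustive; the fallback pattern is an assumption:

```python
# Regions reported in this thread as accepting Signature Version 4 only,
# with the region-specific endpoints used in the workarounds above.
SIGV4_ONLY_REGIONS = {
    'eu-central-1': 's3.eu-central-1.amazonaws.com',      # Frankfurt
    'eu-west-2': 's3.eu-west-2.amazonaws.com',            # London
    'ap-northeast-2': 's3.ap-northeast-2.amazonaws.com',  # Seoul
    'ca-central-1': 's3.ca-central-1.amazonaws.com',      # Canada (Central)
}


def s3_host_for(region):
    """Return the endpoint to pass as host= for a given region."""
    return SIGV4_ONLY_REGIONS.get(region, 's3.%s.amazonaws.com' % region)
```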
While trying to crawl the dandiarchive bucket with authentication, to also fetch files which are not publicly available, I ran into

```
<Error><Code>InvalidRequest</Code><Message>The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256.</Message></Error>
```

for which discussion was ongoing in 2017: jschneier/django-storages#28. A workaround which worked for me was to specify the host option to boto.connect_s3 to point to the specific region. So with this fix it would now be possible to use it in the provider configuration, e.g.

```
[provider:dandi-s3]
url_re = s3://dandiarchive($|/.*)
credential = dandi-s3-backup
authentication_type = aws-s3
aws-s3_host = s3.us-east-2.amazonaws.com
```

There might be other options we may want to add later on, so I did not store host in the attribute, but right within the dictionary of optional kwargs for connect_s3.
Saved my day. Also using eu-central-1.
COPY OF BITBUCKET ISSUE #214 - https://bitbucket.org/david/django-storages/issue/214/aws-s3-frankfurt-region-not-working
"Andreas Schilling created an issue 2015-01-04
Using the Frankfurt region (Germany) with django-storages produces an HTTP 400 error. S3 in the new region supports only Signature Version 4. In all other regions, Amazon S3 supports both Signature Version 4 and Signature Version 2.
I assume django-storages only supports Signature Version 2. Is there any chance to support Version 4?"
Thanks @jschneier for the fork! Is there a chance for django-storages-redux to support the eu-central-1 region?