
AWS S3 Frankfurt region not working #28

Closed
FPurchess opened this issue Feb 15, 2015 · 28 comments

@FPurchess

COPY OF BITBUCKET ISSUE #214 - https://bitbucket.org/david/django-storages/issue/214/aws-s3-frankfurt-region-not-working

"Andreas Schilling created an issue 2015-01-04

Using the Frankfurt region (Germany) with django-storages produces an HTTP 400 error. S3 in the new region supports only Signature Version 4. In all other regions, Amazon S3 supports both Signature Version 4 and Signature Version 2.

I assume django-storages only supports Signature Version 2. Is there any chance to support Version 4?"

Thanks @jschneier for the fork! Is there a chance for django-storages-redux to support the eu-central-1 region?

@FPurchess

S3 boto users can find a temporary workaround here: boto/boto#2741

EDIT: here is a sample workaround:

import os

from storages.backends.s3boto import S3BotoStorage

# boto reads this environment variable (it is not a Django setting); it
# enables Signature Version 4, which eu-central-1 requires.
os.environ['S3_USE_SIGV4'] = 'True'

class S3Storage(S3BotoStorage):
    @property
    def connection(self):
        if self._connection is None:
            # SigV4 needs an explicit regional endpoint, hence the host kwarg.
            self._connection = self.connection_class(
                self.access_key, self.secret_key,
                calling_format=self.calling_format,
                host='s3.eu-central-1.amazonaws.com')
        return self._connection
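
For the subclass to take effect, Django also has to be told to use it; a minimal sketch, assuming the class above lives in a module importable as myproject.storage (the module path is illustrative, adjust to your project):

# settings.py -- module path below is illustrative; point it at wherever
# the S3Storage subclass actually lives in your project
DEFAULT_FILE_STORAGE = 'myproject.storage.S3Storage'
STATICFILES_STORAGE = 'myproject.storage.S3Storage'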

@frisellcpl

I can confirm that the above workaround works for Frankfurt.

Big thanks @FPurchess!

@rotherfuchs

I can confirm this, too! Thank you!!

@BodhiSukha

You can just add this to the settings:

import os

os.environ['S3_USE_SIGV4'] = 'True'
AWS_S3_HOST = 's3.eu-central-1.amazonaws.com'

(Note: I'm using the bitbucket repo)

@jschneier

@BodhiSukha thanks for that, going to close now. If you notice any regression in moving from the BitBucket repo to this one please open an issue.

@satyrius

@BodhiSukha AWS_S3_HOST = 's3.eu-central-1.amazonaws.com' does not work for me. Any idea why? Hacking the class as @FPurchess does works fine.

boto==2.36.0
django-storages==1.1.8

@BodhiSukha

@satyrius sorry, I just saw your comment.

I'm using it in two different setups:

boto==2.34.0 
django-storages==1.1.8

and

boto==2.36.0 
django-storages==1.1.8

It works fine on both of them. Unfortunately I'm not sure what your problem is. The only other thing that differs in my settings is the calling format. In a Python 2.7.9 environment I use this extra setting:

import boto.s3.connection

# OrdinaryCallingFormat uses path-style addressing (the bucket goes in the
# URL path rather than the hostname).
AWS_S3_CALLING_FORMAT = boto.s3.connection.OrdinaryCallingFormat()

@miracle2k

When I try using this workaround, I get HostRequiredError: BotoClientError: When using SigV4, you must specify a 'host' parameter.

django-storages==1.1.8
boto==2.38.0

@frisellcpl

Funny, this started throwing 403 Forbidden all of a sudden, but only when requesting from localhost. Might it have something to do with HTTP vs. HTTPS?

#EDIT#
This had to do with my local time settings. After syncing to NTP everything worked as expected. I blame Friday afternoon. And apologies for the stupid comment.

@Leistungsabfall

@miracle2k Make sure you're providing the modified storage class in STATICFILES_STORAGE and not the boto default one.

This works for me:

import os
from storages.backends.s3boto import S3BotoStorage

os.environ['S3_USE_SIGV4'] = 'True'

class S3Storage(S3BotoStorage):
    @property
    def connection(self):
        if self._connection is None:
            self._connection = self.connection_class(
                self.access_key, self.secret_key,
                calling_format=self.calling_format, host='s3.eu-central-1.amazonaws.com')
        return self._connection

STATICFILES_STORAGE = 'MyProject.settings.S3Storage'

@somacci

somacci commented Feb 18, 2016

I have checked the issues about Signature Version 4 with boto for the Frankfurt region. I had the same problem with Seoul as with Frankfurt; both regions support Signature Version 4 only.
So, is this closed or not? If it is closed, how do I solve it? Please let me know.

@galuszkak

@somacci solution by @FPurchess works. Try it.

@lusentis

Thanks @Leistungsabfall!
The 400 Bad Request exception persisted until I updated to boto==2.38.0 (I was running 2.22).

@hugodes

hugodes commented Nov 27, 2016

Could we re-open this issue, please? It's not solved; we just have a workaround.
Support for SigV4 auth is really important for everyone deploying outside the U.S.

@sdeleeuw

sdeleeuw commented Dec 8, 2016

I had the same issue and fixed it by using S3Boto3Storage instead of S3BotoStorage, plus two undocumented parameters in my settings.py.

requirements.txt:

  • boto3==1.4.1
  • django-storages==1.5.1

settings.py:

  • STATICFILES_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
  • AWS_ACCESS_KEY_ID = ''
  • AWS_SECRET_ACCESS_KEY = ''
  • AWS_STORAGE_BUCKET_NAME = ''
  • AWS_S3_REGION_NAME = 'eu-central-1' (undocumented)
  • AWS_S3_SIGNATURE_VERSION = 's3v4' (undocumented)
  • AWS_QUERYSTRING_AUTH = False
  • AWS_S3_FILE_OVERWRITE = True

To be honest, I didn't set the STATICFILES_STORAGE setting, because my application uses the S3Boto3Storage class for specific tasks, not for all static files. But I think it should work this way.

I stumbled upon the undocumented options by going through the source code of django-storages.

Hope this will be a solution for others too.
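
Collected in one place, a minimal settings.py sketch of the above (the empty strings are placeholders for your own credentials and bucket):

# settings.py -- consolidated sketch of the boto3-based fix described above
STATICFILES_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
AWS_ACCESS_KEY_ID = ''        # placeholder
AWS_SECRET_ACCESS_KEY = ''    # placeholder
AWS_STORAGE_BUCKET_NAME = ''  # placeholder
AWS_S3_REGION_NAME = 'eu-central-1'  # undocumented at the time of writing
AWS_S3_SIGNATURE_VERSION = 's3v4'    # undocumented at the time of writing
AWS_QUERYSTRING_AUTH = False
AWS_S3_FILE_OVERWRITE = True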

@nehaljwani

@sdeleeuw's workaround works beautifully.

@jjanczyszyn

jjanczyszyn commented Mar 24, 2017

In my case, for the Frankfurt region it was enough to add these two lines to the settings (no need to override S3BotoStorage):
S3_USE_SIGV4 = True
AWS_S3_HOST = 's3.eu-central-1.amazonaws.com'

Tested with
django-storages==1.5.2
boto==2.46.1

@BigglesZX

I experienced an issue with the London region (eu-west-2) and @tramwaj29's solution fixed it for me, substituting the region name in the value of AWS_S3_HOST.

@jschneier

I'm going to reopen this until I can get it added to the docs.

@oesah

oesah commented Jun 18, 2017

I used the method by @sdeleeuw, and at least it doesn't throw any errors anymore. However, when I upload a file, it doesn't create the file in S3. I created an IAM user and am using that user's credentials. I also added the user as a principal:

{
    "Version": "2008-10-17",
    "Statement": [
        {
            "Sid": "PublicReadForGetBucketObjects",
            "Effect": "Allow",
            "Principal": {
                "AWS": "*"
            },
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::bucket/media/*"
        },
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::123456789:user/myuser"
            },
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::bucket/media",
                "arn:aws:s3:::bucket/media/*"
            ]
        }
    ]
}

And CORS:

<?xml version="1.0" encoding="UTF-8"?>
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
    <CORSRule>
        <AllowedOrigin>*</AllowedOrigin>
        <AllowedMethod>GET</AllowedMethod>
        <AllowedMethod>HEAD</AllowedMethod>
        <MaxAgeSeconds>3000</MaxAgeSeconds>
        <AllowedHeader>Authorization</AllowedHeader>
    </CORSRule>
</CORSConfiguration>

Any ideas?

@oesah

oesah commented Jun 18, 2017

So I solved my problem by switching to:

boto3==1.4.4
django-storages==1.5.2

Not sure what the problem was, but I am using eu-central-1. Maybe that was the issue, but anyway, it seems to be fixed.

@jschneier

jschneier commented Jun 21, 2017

@oesah I think that s3boto3 handles this automagically, but the same doesn't hold for s3boto. I have opened #335, which includes documentation for the fixes discovered in this thread and changes the default signature version so no one will be bitten by this in the future. Thanks for your patience and your discoveries, everyone. If you see anything wrong in #335, please say something. I'm going to merge it in the morning and cut 1.6 immediately afterwards.

@Tllew

Tllew commented May 24, 2018

I'm getting:

<?xml version="1.0" encoding="UTF-8"?>
<Error>
    <Code>InvalidRequest</Code>
    <Message>The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256.</Message>
    <RequestId>548992053424FAF0</RequestId>
    <HostId>bE/0RI6Wn5J41Qkf/kKRnA8A+uLuy9N1C6wyGsptJeqqOqmyaV+JCbXbj8hzHI7Bp1PtFOo897k=</HostId>
</Error>

with this code:

import boto
from django.conf import settings

AWS_S3_HOST = 's3.eu-west-2.amazonaws.com'
s3 = boto.connect_s3(settings.S3_CALLS_KEY, settings.S3_CALLS_SECRET, host=AWS_S3_HOST)
bucket = s3.get_bucket(settings.S3_CALLS_BUCKET)

Can anyone see the issue here?

I'm trying for London; I got the S3 host from here.

EDIT: I updated boto, but the update went to a different Python environment. After updating boto in the correct environment, my code above works.

@coler-j

coler-j commented Sep 14, 2018

This also appears to happen in the 'Canada (Central)' region. I was getting a Cross-Origin Read Blocking (CORB) error in the browser (caused because the response was error XML with an XML content type instead of the expected image content type) until I changed region. Going to implement the workaround by @sdeleeuw.

Edit: The workaround worked great for the Canada region.

@vnikolayev1

vnikolayev1 commented Apr 26, 2019

Adding

AWS_S3_REGION_NAME = 'eu-central-1'

fixed it for me. I guess you just need to set your region and you're good to go.
Full code:

AWS_ACCESS_KEY_ID = ""
AWS_SECRET_ACCESS_KEY = ""
AWS_STORAGE_BUCKET_NAME = ""
AWS_S3_REGION_NAME = 'eu-central-1'

AWS_S3_FILE_OVERWRITE = False
AWS_DEFAULT_ACL = None

DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'

@filipeximenes

Just my 2 cents here: I might be missing something, but it seems like boto reads S3_USE_SIGV4 straight from the environment variables, so setting it in settings.py has no effect whatsoever. See here:
https://github.com/boto/boto/blob/03b2268348ea81d80e3e5ddea0970f4968561010/boto/auth.py#L1067

Can someone confirm this? If it is true, the docs need to be updated.
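
If that is the case, a minimal sketch of the workaround is to export the variable before boto is imported, e.g. at the very top of settings.py (the exact placement is an assumption; any point before boto is first imported should do):

# settings.py -- set the env var before boto/django-storages are imported,
# since boto appears to read os.environ rather than Django settings
import os
os.environ['S3_USE_SIGV4'] = 'True'

AWS_S3_HOST = 's3.eu-central-1.amazonaws.com'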

@filipeximenes

Setting the region name to one of the following also does the job, because of this:

SIGV4_DETECT = [
    '.cn-',
    # In eu-central and ap-northeast-2 we support both host styles for S3
    '.eu-central',
    '-eu-central',
    '.ap-northeast-2',
    '-ap-northeast-2',
    '.ap-south-1',
    '-ap-south-1',
    '.us-east-2',
    '-us-east-2',
    '-ca-central',
    '.ca-central',
    '.eu-west-2',
    '-eu-west-2',
]
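
Roughly, boto's detection amounts to a substring check of these fragments against the endpoint host name; a rough sketch of the idea (the helper function is illustrative, not boto's actual API):

def host_triggers_sigv4(host):
    # approximation of boto's check: any SIGV4_DETECT fragment appearing
    # in the host name switches authentication to Signature Version 4
    return any(fragment in host for fragment in SIGV4_DETECT)

host_triggers_sigv4('s3.eu-central-1.amazonaws.com')  # True, matches '.eu-central'
host_triggers_sigv4('s3.amazonaws.com')               # False, stays on SigV2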

yarikoptic added a commit to yarikoptic/datalad that referenced this issue Mar 3, 2020
While trying to crawl the dandiarchive bucket with authentication, to also fetch
files which are not publicly available, I ran into

  <Error><Code>InvalidRequest</Code><Message>The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256.<

for which discussion was ongoing in 2017: jschneier/django-storages#28 .
A workaround which worked for me was to specify host option to boto.connect_s3 to
point to the specific region.  So with this fix now it would be possible to use it
in the provider configuration, e.g.

	[provider:dandi-s3]
	url_re = s3://dandiarchive($|/.*)
	credential = dandi-s3-backup
	authentication_type = aws-s3
	aws-s3_host = s3.us-east-2.amazonaws.com

There might be other options we might want to add later on, so I did not
store host in the attribute, but right within the dictionary of optional
kwargs for connect_s3.
@ikahitin

So I solved my problem by switching to:

boto3==1.4.4
django-storages==1.5.2

Not sure what the problem was, but I am using eu-central-1. Maybe that was the issue, but anyway, it seems to be fixed.

Saved my day. Also using eu-central-1.
