Description
I'm trying to use Polars to access a parquet file stored in DigitalOcean Spaces, which is an S3-compatible storage service. It works with the boto3 package, but I can't make it work with Polars.

I have set `access_key_id` and `secret_access_key` in `~/.aws/credentials`. I can list the contents of the bucket with boto3, roughly as in the sketch below.
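A minimal sketch of the working boto3 call (the endpoint URL is a placeholder for the actual Spaces region; the credentials are picked up from `~/.aws/credentials`):

```python
# Minimal sketch: list the bucket through the DigitalOcean Spaces endpoint.
# The endpoint URL is a placeholder for the actual Spaces region.
import boto3

session = boto3.session.Session()
client = session.client(
    "s3",
    region_name="us-east-1",
    endpoint_url="https://<region>.digitaloceanspaces.com",
)

# This succeeds and prints the keys in the bucket, including test.parquet.
response = client.list_objects_v2(Bucket="mybucket")
for obj in response.get("Contents", []):
    print(obj["Key"])
```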
Note that the `endpoint_url` is specified. In Spaces I have a bucket called `mybucket` containing a file called `test.parquet`. (Apparently the `aws_region` should be fixed to `us-east-1` for DigitalOcean.) When I try the equivalent read with Polars, roughly as sketched below, I get an error.
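A sketch of the Polars attempt; the exact `storage_options` keys are an assumption about how to pass the credentials and endpoint, and the endpoint URL is again a placeholder:

```python
# Sketch of the failing Polars call. The storage_options keys are an assumption
# about how to point Polars at a non-AWS, S3-compatible endpoint.
import polars as pl

storage_options = {
    "aws_access_key_id": "<access_key_id>",
    "aws_secret_access_key": "<secret_access_key>",
    "aws_region": "us-east-1",  # apparently must be us-east-1 for DigitalOcean
    "endpoint_url": "https://<region>.digitaloceanspaces.com",  # placeholder
}

# This raises an error instead of returning the DataFrame.
df = pl.read_parquet("s3://mybucket/test.parquet", storage_options=storage_options)
```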
If I specify the bucket more elaborately, I get a different error suggesting that the endpoint is hard-coded to `s3.amazonaws.com`.