KeyError when uploading Large files #109

Open
djhopper01 opened this issue Dec 5, 2016 · 0 comments

@djhopper01
Contributor

Thanks for building this. It has really helped us a ton.

I'm not a Python developer, so bear with me :)

We're using turbolift to upload backups to Rackspace. Recently, our backups have exceeded 5GB.

In an attempt to fix this, we upgraded to turbolift 3 since it supports large files. We had been using the archive method, but we're aware it's not yet ready in turbolift 3 (#72).

While testing the upload method, we ran into this error:

  File "/home/deploy/turbolift/turbolift/clouderator/actions.py", line 605, in put_object
    local_object=local_object
  File "/home/deploy/turbolift/turbolift/clouderator/utils.py", line 46, in f_retry
    return f(*args, **kwargs)
  File "/home/deploy/turbolift/turbolift/clouderator/actions.py", line 174, in _putter
    manifest = headers.pop('X-Object-Manifest')
KeyError: 'X-Object-Manifest'
Done.

I did some investigation and it looks like _putter pops 'X-Object-Manifest' out of the headers dict it was passed, so when f_retry re-invokes it after a failed attempt the key is already gone and the second pop raises KeyError. Making a shallow copy of the headers inside _putter seems to work better:

    # In turbolift/clouderator/actions.py
    def _putter(self, uri, headers, local_object=None):
        ...
        if os.path.getsize(local_object) > large_object_size:
            headers = headers.copy()
            # Remove the manifest entry while working with chunks
            manifest = headers.pop('X-Object-Manifest')
        ...
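To illustrate why the copy helps, here is a minimal, self-contained sketch of what I think is going on. This is not turbolift's actual code; retry and putter below are simplified stand-ins for clouderator.utils.f_retry and _putter:

    import functools

    def retry(attempts=2):
        """Simplified stand-in for clouderator.utils.f_retry: retry on failure."""
        def decorator(f):
            @functools.wraps(f)
            def f_retry(*args, **kwargs):
                for _ in range(attempts):
                    try:
                        return f(*args, **kwargs)
                    except IOError:
                        continue  # pretend the upload failed and try again
            return f_retry
        return decorator

    calls = []

    @retry(attempts=2)
    def putter(headers):
        # Without headers = headers.copy(), every attempt sees the same dict,
        # so the retry's pop() finds the key already gone and raises KeyError.
        manifest = headers.pop('X-Object-Manifest')
        calls.append(manifest)
        if len(calls) == 1:
            raise IOError('simulated upload failure')
        return manifest

    shared_headers = {'X-Object-Manifest': 'container/object'}
    putter(shared_headers)  # first attempt fails; the retry raises KeyError: 'X-Object-Manifest'

An alternative would be headers.pop('X-Object-Manifest', None), but copying the dict also keeps _putter from mutating the caller's headers, which seems like the safer fix.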

If this is the right way to handle it, let me know and I can submit a pull request. Or if this isn't an issue and I'm simply doing something wrong, let me know :)

Also, we're running turbolift on a 1 GB DigitalOcean instance and ran into memory issues with the default upload options. To help anyone else who hits this: you can pass --chunk-size 268435456 to upload your chunks in 256 MB increments.
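For reference, that value is just 256 MB expressed in bytes, so it's easy to scale to whatever your instance's memory allows:

    # 256 MB expressed in bytes; change the first factor for a different chunk size
    chunk_size = 256 * 1024 * 1024
    print(chunk_size)  # 268435456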
