
Container Clone - Wrong Metadata Headers in v3 & Memory Issues in v2.1.3 #102

Open
sathyavpk opened this issue Oct 29, 2015 · 0 comments

Hello Kevin,

I tried a container clone using turbolift v3.0.0; it was quick and successfully transferred around 4.4 GB of data, but it breaks the "content-type" header. So, as per your suggestion in #101, I tried again with turbolift v2.1.3, but it returns multiple errors:

  1. Something seems wrong with the concurrency. I ran turbolift from the same cloud server where I first cloned using v3.0.0, but I get a "cannot allocate memory" error like the one below (a sketch of what I mean by limiting worker processes follows the second error output).

Getting Object list from the Source.
Processing - [ / ] - Please Wait... INFO: 368083 object(s) found
Beginning Sync Operation.
This will take "13" operations to complete.
Job Count 1
Thread Starting Cycle
Processing - [ | ] - Number of Jobs in Queue = 29991 All Done!
Traceback (most recent call last):
  File "/usr/bin/turbolift", line 9, in <module>
    load_entry_point('turbolift==2.1.3', 'console_scripts', 'turbolift')()
  File "/usr/lib/python2.6/site-packages/turbolift-2.1.3-py2.6.egg/turbolift/executable.py", line 42, in run_turbolift
    worker.start_work()
  File "/usr/lib/python2.6/site-packages/turbolift-2.1.3-py2.6.egg/turbolift/worker.py", line 46, in start_work
    actions(auth=auth.authenticate()).start()
  File "/usr/lib/python2.6/site-packages/turbolift-2.1.3-py2.6.egg/turbolift/methods/clone.py", line 135, in start
    kwargs=kwargs
  File "/usr/lib/python2.6/site-packages/turbolift-2.1.3-py2.6.egg/turbolift/utils/multi_utils.py", line 239, in job_processer
    basic.stupid_hack(wait=.2)
  File "/usr/lib64/python2.6/contextlib.py", line 34, in __exit__
    self.gen.throw(type, value, traceback)
  File "/usr/lib/python2.6/site-packages/turbolift-2.1.3-py2.6.egg/turbolift/utils/multi_utils.py", line 313, in spinner
    yield
  File "/usr/lib/python2.6/site-packages/turbolift-2.1.3-py2.6.egg/turbolift/utils/multi_utils.py", line 237, in job_processer
    kwargs=kwargs
  File "/usr/lib/python2.6/site-packages/turbolift-2.1.3-py2.6.egg/turbolift/utils/multi_utils.py", line 146, in worker_proc
    _job.start()
  File "/usr/lib64/python2.6/multiprocessing/process.py", line 104, in start
    self._popen = Popen(self)
  File "/usr/lib64/python2.6/multiprocessing/forking.py", line 94, in __init__
    self.pid = os.fork()
OSError: [Errno 12] Cannot allocate memory
Traceback (most recent call last):
  File "/usr/lib64/python2.6/multiprocessing/queues.py", line 242, in _feed

  2. I even tried from a different cloud server with 8 GB of RAM, but there it shows the errors below:

Getting Object list from the Source.
Processing - [ / ] - Please Wait... INFO: 368083 object(s) found
Beginning Sync Operation.
This will take "13" operations to complete.
Job Count 1
Processing - [ | ] - Number of Jobs in Queue = 30000 Thread Starting Cycle
Processing - [ | ] - Number of Jobs in Queue = 29802 System Problems Found SWIFT-API FAILURE -> REASON status code 422 REQUEST <PreparedRequest [PUT]>
ADDITIONAL DATA: {'last-modified': 'Fri, 23 Oct 2015 06:32:26 GMT', 'Connection': 'Keep-alive', 'etag': 'b972cca657fb66c17b4a163456d8e0c1', 'x-timestamp': '1445581945.49011', 'accept-ranges': 'bytes', 'X-Auth-Token': u'AAD--h7GR9UmjlDJ2U4n6PMi-pIiXeFs6qXNNXrvw2iMYGQggauHpHTRZ2CQlVF2uUPFzOSqt0wc_4vvXwgAMwHaSv9kj4C-rTvMkKDT_jxKcelCSrl0j7zk3j1RZQHEsb01DCrLUYhXwA', 'content-type': 'image/jpeg', 'User-Agent': 'turbolift'} {u'hash': u'7fb3948979b9ba3a8fa3022775bdb456', u'last_modified': u'2015-10-23T06:30:29.888100', u'bytes': 4191, u'name': u'public/uploads/catalog/product/small/1/8/1851491_1_v2_710.jpg', u'content_type': u'image/jpeg'}
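
To show what I mean by limiting workers in point 1, here is a minimal, generic Python sketch of my understanding (this is not turbolift's code, and copy_object is only a placeholder for the per-object work): every os.fork() duplicates the parent process, so queuing workers for all 368083 objects without a cap can run the box out of memory, while a small fixed pool keeps the number of live forks constant.

import multiprocessing


def copy_object(name):
    # Placeholder for the per-object copy work the real tool would do.
    return name


if __name__ == '__main__':
    object_names = ['obj-%d' % i for i in range(368083)]
    # Bound concurrency to a small, fixed number of worker processes.
    pool = multiprocessing.Pool(processes=4)
    try:
        for _ in pool.imap_unordered(copy_object, object_names, chunksize=1000):
            pass
    finally:
        pool.close()
        pool.join()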

Can you help me clone this container of 4.9 GB with nearly 400K files to another container without breaking the "content-type" header?
Is there any way to control the number of concurrent threads in v2.1.3, or to use --clone-headers in v3.0.0?
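
In case it clarifies what I need to preserve, here is a minimal sketch of a server-side copy made directly against the Swift API, which keeps the original content-type and metadata; the storage URL, token, and container names below are placeholders, not values from my account.

import requests

STORAGE_URL = 'https://storage.example.com/v1/AUTH_account'  # placeholder
TOKEN = 'AUTH_tk_placeholder'                                 # placeholder
SRC_CONTAINER = 'source-container'
DST_CONTAINER = 'dest-container'


def server_side_copy(object_name):
    # PUT with X-Copy-From asks Swift to copy the object inside the cluster,
    # preserving content-type, ETag, and user metadata; no data passes
    # through the client.
    resp = requests.put(
        '%s/%s/%s' % (STORAGE_URL, DST_CONTAINER, object_name),
        headers={
            'X-Auth-Token': TOKEN,
            'X-Copy-From': '/%s/%s' % (SRC_CONTAINER, object_name),
            'Content-Length': '0',
        },
    )
    resp.raise_for_status()


if __name__ == '__main__':
    server_side_copy('public/uploads/catalog/product/small/1/8/1851491_1_v2_710.jpg')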

Thanks,
Sathyan
