
[Errno 2] No such file or directory: './data/COCO/annotations/instances_valminusminival2014.json' #13

Open
Zyt-wenwne opened this issue Apr 25, 2021 · 5 comments

Comments

@Zyt-wenwne

I ran python data/split_coco_dataset_voc_nonvoc.py and got the following error. How can I solve it?

Split 1: 346721 anns; save to ./data/COCO/annotations/split_voc_instances_train2014.json
Split 2: 258186 anns; save to ./data/COCO/annotations/split_nonvoc_instances_train2014.json
processing dataset ./data/COCO/annotations/instances_valminusminival2014.json
Traceback (most recent call last):
File "data/split_coco_dataset_voc_nonvoc.py", line 87, in
split_dataset(dataset_prefix + s, voc_inds, split1_prefix + s, split2_prefix + s)
File "data/split_coco_dataset_voc_nonvoc.py", line 16, in split_dataset
with open(dataset_file) as f:
FileNotFoundError: [Errno 2] No such file or directory: './data/COCO/annotations/instances_valminusminival2014.json'

@Ze-Yang
Owner

Ze-Yang commented Apr 25, 2021

You should first download the COCO 2014 dataset here, and place it according to the instructions I provide in the README.md file. This error indicates that the dataset has not been set up correctly.
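As a quick sanity check, you can verify that the annotation files the split script expects are actually in place. A minimal sketch (the file names below are inferred from the script output in this thread, not from the README, so adjust them to whatever the README specifies):

import os

# The split script opens these annotation files directly, so they must exist here.
ann_dir = "./data/COCO/annotations"
for name in ["instances_train2014.json", "instances_valminusminival2014.json"]:
    path = os.path.join(ann_dir, name)
    print(("OK      " if os.path.isfile(path) else "MISSING ") + path)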

@Zyt-wenwne
Author

Thanks for your reply. I have downloaded the COCO 2014 dataset.

@Zyt-wenwne
Author

A problem occurred when I ran:
python train.py --save-folder weights/VOC_split1_pretrain -d VOC -p 1 -max 50000 --steps 30000 40000 --checkpoint-period 4000 --warmup-iter 1000 --setting incre --split 1

The earlier run of python train.py --save-folder weights/COCO60_pretrain -d COCO -p 1 finished successfully.

[04/27 08:26:11 Context-Transformer]: Starting training from iteration 0
Traceback (most recent call last):
File "train.py", line 300, in
train(model, args.resume)
File "train.py", line 220, in train
data, targets = next(data_loader)
File "/home/plc319/anaconda3/envs/ct/lib/python3.6/site-packages/torch/utils/data/dataloader.py", line 345, in next
data = self._next_data()
File "/home/plc319/anaconda3/envs/ct/lib/python3.6/site-packages/torch/utils/data/dataloader.py", line 856, in _next_data
return self._process_data(data)
File "/home/plc319/anaconda3/envs/ct/lib/python3.6/site-packages/torch/utils/data/dataloader.py", line 881, in _process_data
data.reraise()
File "/home/plc319/anaconda3/envs/ct/lib/python3.6/site-packages/torch/_utils.py", line 394, in reraise
raise self.exc_type(msg)
AttributeError: Caught AttributeError in DataLoader worker process 0.
Original Traceback (most recent call last):
File "/home/plc319/anaconda3/envs/ct/lib/python3.6/site-packages/torch/utils/data/_utils/worker.py", line 178, in _worker_loop
data = fetcher.fetch(index)
File "/home/plc319/anaconda3/envs/ct/lib/python3.6/site-packages/torch/utils/data/_utils/fetch.py", line 44, in fetch
data = [self.dataset[idx] for idx in possibly_batched_index]
File "/home/plc319/anaconda3/envs/ct/lib/python3.6/site-packages/torch/utils/data/_utils/fetch.py", line 44, in
data = [self.dataset[idx] for idx in possibly_batched_index]
File "/home/plc319/桌面/Context-Transformer-master/data/voc0712.py", line 235, in getitem
img1, target1 = self.preproc(img1, target1)
File "/home/plc319/桌面/Context-Transformer-master/data/data_augment.py", line 178, in call
image_o = image.copy()
AttributeError: 'NoneType' object has no attribute 'copy'

@Ze-Yang
Owner

Ze-Yang commented Jun 10, 2021

I can't tell what the problem is from the information you provide. By the way, you may want to avoid Chinese characters in the directory path, since they can sometimes cause weird errors that are hard to notice. Hope it helps. Thanks.
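For reference, the error itself ("'NoneType' object has no attribute 'copy'") is the typical symptom of OpenCV's imread silently returning None when it cannot read an image file, e.g. because of a wrong path or, on some systems, a path containing non-ASCII characters. Assuming the dataset images are loaded with OpenCV (which the traceback suggests), a minimal defensive loader might look like this; load_image is a hypothetical helper, not the repository's actual code:

import cv2
import numpy as np

def load_image(path):
    # Read the raw bytes with numpy so the path handling is done by Python,
    # then decode explicitly with OpenCV.
    data = np.fromfile(path, dtype=np.uint8)
    img = cv2.imdecode(data, cv2.IMREAD_COLOR)
    if img is None:
        # imread/imdecode return None instead of raising, which later surfaces
        # as the AttributeError inside the preprocessing step.
        raise FileNotFoundError("Could not read image: " + path)
    return img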

@alphacyp

alphacyp commented Dec 7, 2021

Has your problem been solved? I have encountered the same problem as you. Can you tell me how you solved it? Thank you!
