
I have googled this, but it has not been solved yet. Please help me with the following problem #113

Open
HaoWeiHsueh opened this issue May 7, 2022 · 0 comments


I suspect it is a PyTorch version issue, and StackOverflow does not have a solution to this problem.
1. The following error occurs when I execute python demo.py --gpu 0 --stage param --test_epoch 8:

RuntimeError: Error(s) in loading state_dict for DataParallel:
size mismatch for module.param_regressor.fc_pose.0.weight: copying a param with shape torch.Size([144, 512]) from checkpoint, the shape in current model is torch.Size([96, 512]).
size mismatch for module.param_regressor.fc_pose.0.bias: copying a param with shape torch.Size([144]) from checkpoint, the shape in current model is torch.Size([96]).
size mismatch for module.human_model_layer.th_shapedirs: copying a param with shape torch.Size([6890, 3, 10]) from checkpoint, the shape in current model is torch.Size([778, 3, 10]).
size mismatch for module.human_model_layer.th_posedirs: copying a param with shape torch.Size([6890, 3, 207]) from checkpoint, the shape in current model is torch.Size([778, 3, 135]).
size mismatch for module.human_model_layer.th_v_template: copying a param with shape torch.Size([1, 6890, 3]) from checkpoint, the shape in current model is torch.Size([1, 778, 3]).
size mismatch for module.human_model_layer.th_J_regressor: copying a param with shape torch.Size([24, 6890]) from checkpoint, the shape in current model is torch.Size([16, 778]).
size mismatch for module.human_model_layer.th_weights: copying a param with shape torch.Size([6890, 24]) from checkpoint, the shape in current model is torch.Size([778, 16]).
size mismatch for module.human_model_layer.th_faces: copying a param with shape torch.Size([13776, 3]) from checkpoint, the shape in current model is torch.Size([1538, 3]).
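For reference, the pattern in the shapes above can be inspected programmatically. The sketch below (hypothetical shape tables, populated from the error message) diffs checkpoint shapes against model shapes; the 6890-vertex / 24-joint sizes in the checkpoint match the SMPL body template, while the 778-vertex / 16-joint sizes in the current model match the MANO hand template, which suggests a body checkpoint is being loaded into a hand model rather than a PyTorch version problem.

```python
# Hypothetical shape tables taken from the error message above.
# The checkpoint's human_model_layer shapes (6890 vertices, 24 joints)
# match the SMPL body template; the built model's shapes (778 vertices,
# 16 joints) match the MANO hand template. The helper itself is generic.

def diff_shapes(checkpoint, model):
    """Return {param_name: (ckpt_shape, model_shape)} for every mismatch."""
    return {
        name: (checkpoint[name], model[name])
        for name in checkpoint
        if name in model and checkpoint[name] != model[name]
    }

ckpt_shapes = {
    "module.param_regressor.fc_pose.0.weight": (144, 512),
    "module.human_model_layer.th_v_template": (1, 6890, 3),
    "module.human_model_layer.th_J_regressor": (24, 6890),
}
model_shapes = {
    "module.param_regressor.fc_pose.0.weight": (96, 512),
    "module.human_model_layer.th_v_template": (1, 778, 3),
    "module.human_model_layer.th_J_regressor": (16, 778),
}

for name, (ckpt, cur) in diff_shapes(ckpt_shapes, model_shapes).items():
    print(f"size mismatch for {name}: checkpoint {ckpt} vs model {cur}")
```

With a real checkpoint, the same diff could be run over `torch.load(path)` and `model.state_dict()` with `tuple(t.shape)` on each tensor.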

2. The following error occurs when I execute python demo.py --gpu 0 --stage param --test_epoch 12:
Traceback (most recent call last):
File "demo.py", line 99, in
out = model(inputs, targets, meta_info, 'test')
File "E:\Software\Anaconda3\envs\I2L\lib\site-packages\torch\nn\modules\module.py", line 1051, in _call_impl
return forward_call(*input, **kwargs)
File "E:\Software\Anaconda3\envs\I2L\lib\site-packages\torch\nn\parallel\data_parallel.py", line 166, in forward
return self.module(*inputs[0], **kwargs[0])
File "E:\Software\Anaconda3\envs\I2L\lib\site-packages\torch\nn\modules\module.py", line 1051, in _call_impl
return forward_call(*input, **kwargs)
File "..\main\model.py", line 75, in forward
joint_img_from_mesh = torch.bmm(torch.from_numpy(self.joint_regressor).cuda()[None,:,:].repeat(mesh_coord_img.shape[0],1,1), mesh_coord_img)
RuntimeError: batch1 dim 2 must match batch2 dim 1
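For context, torch.bmm multiplies batched matrices of shapes (B, n, m) and (B, m, p), so the second dimension of the first operand must equal the first (non-batch) dimension of the second. A minimal pure-Python sketch of that constraint, using hypothetical shapes consistent with the mismatch reading above (a 24x6890 SMPL joint regressor applied to a 778-vertex MANO mesh):

```python
# torch.bmm(a, b) on shapes (B, n, m) and (B, m2, p) requires m == m2.
# This sketch checks the same constraint without torch; the shapes are
# hypothetical, illustrating a joint regressor built for one mesh
# template applied to a mesh from another.

def bmm_output_shape(a_shape, b_shape):
    """Output shape of torch.bmm(a, b); raises on an inner-dim mismatch."""
    batch_a, n, m = a_shape
    batch_b, m2, p = b_shape
    if batch_a != batch_b:
        raise RuntimeError("batch sizes must match")
    if m != m2:
        raise RuntimeError("batch1 dim 2 must match batch2 dim 1")
    return (batch_a, n, p)

# A 24x6890 (SMPL-sized) regressor against a 778-vertex (MANO-sized) mesh:
try:
    bmm_output_shape((1, 24, 6890), (1, 778, 3))
except RuntimeError as e:
    print(e)  # batch1 dim 2 must match batch2 dim 1

# A regressor matching the mesh's vertex count succeeds:
print(bmm_output_shape((1, 16, 778), (1, 778, 3)))  # (1, 16, 3)
```

If this reading is right, both errors point at the same root cause: the joint regressor and checkpoint come from a different human-model template than the one the demo was configured to build.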
