
RuntimeError: The size of tensor a (173) must match the size of tensor b (160) at non-singleton dimension 0 #29

Closed
berylyellow opened this issue Aug 27, 2021 · 6 comments

Comments

@berylyellow
I wrote my own data-loading code for the VeRi-Wild dataset and wanted to evaluate a model trained on VeRi directly on VeRi-Wild, but it throws an error. What else do I need to change?
The full error message is:

```
Traceback (most recent call last):
  File "test.py", line 53, in
    model.load_param(cfg.TEST.WEIGHT)
  File "E:\Win_Pycharm_Project\TransReID-main\TransReID-main\model\make_model.py", line 376, in load_param
    self.state_dict()[i.replace('module.', '')].copy_(param_dict[i])
RuntimeError: The size of tensor a (173) must match the size of tensor b (160) at non-singleton dimension 0
```

Thanks in advance!

@heshuting555
Collaborator

It's probably an fc-layer size mismatch. Print out the parameter keys and you'll see right away which one is failing.
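That debugging step can be sketched like this (a minimal, self-contained example; the state dicts below are hypothetical stand-ins for the real `model.state_dict()` and `torch.load(cfg.TEST.WEIGHT)`, with shapes that mimic the 173-vs-160 mismatch from the traceback):

```python
import torch

# Hypothetical stand-ins for the model's state dict and the loaded checkpoint.
model_state = {"classifier.weight": torch.zeros(160, 768),
               "base.patch_embed.proj.weight": torch.zeros(768, 3, 16, 16)}
param_dict = {"module.classifier.weight": torch.zeros(173, 768),
              "module.base.patch_embed.proj.weight": torch.zeros(768, 3, 16, 16)}

mismatched = []
for key, value in param_dict.items():
    key = key.replace("module.", "")
    if key not in model_state:
        print(f"{key}: only in checkpoint")
    elif model_state[key].shape != value.shape:
        mismatched.append(key)
        print(f"{key}: model {tuple(model_state[key].shape)} "
              f"vs checkpoint {tuple(value.shape)}")

print(mismatched)  # -> ['classifier.weight']
```

If the offending key turns out to be the classifier, the number of training IDs in your dataset differs from the one the checkpoint was trained on.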

@YuzhouPeng

> I wrote my own data-loading code for the VeRi-Wild dataset and wanted to evaluate a model trained on VeRi directly on VeRi-Wild, but it throws the same RuntimeError when loading the weights.

Did you manage to solve this? I'm running into a similar error.

@hamid-mp

Hi, thank you for sharing the code.
I have the same problem.
Here is part of my defaults.py config:

```python
_C.MODEL.NAME = 'transformer'
_C.MODEL.LAST_STRIDE = 1
_C.MODEL.PRETRAIN_PATH = './jx_vit_base_p16_224-80ecf9dd.pth'
_C.MODEL.PRETRAIN_CHOICE = 'imagenet'
...
_C.MODEL.TRANSFORMER_TYPE = 'vit_base_patch16_224_TransReID'
...
_C.TEST.WEIGHT = "./vit_transreid_vehicleID.pth"
```

I want to use your pretrained models on vehicle ReID datasets, so I'd be grateful if you could point out where I'm going wrong. Thanks.
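One common cause (an assumption on my part, not confirmed by the authors here): the length of `base.pos_embed` depends on the input resolution, and the released vehicle-ReID checkpoints may have been trained at a different `SIZE_TRAIN` than the person-ReID default of `[256, 128]`. With TransReID's overlapping patch embedding (patch 16, stride 12) the token grid works out as:

```python
# Patch-grid arithmetic for an overlapping patch embedding (patch 16, stride 12).
# The input sizes below are assumptions about how the checkpoints were trained.
def patch_grid(h, w, patch=16, stride=12):
    num_y = (h - patch) // stride + 1
    num_x = (w - patch) // stride + 1
    return num_y, num_x

print(patch_grid(256, 256))  # e.g. a square vehicle-ReID input -> (21, 21)
print(patch_grid(256, 128))  # the person-ReID default -> (21, 10)
```

If `SIZE_TRAIN`/`SIZE_TEST` in your config don't match the checkpoint's training resolution, the positional embedding has a different number of tokens and `copy_` fails exactly as above.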

@kp97524

kp97524 commented Nov 14, 2023

Hi all,

Did you figure out a fix for this issue? I'm facing the same problem and there hasn't been an update in a while :(

Related GitHub issue: #70

@KingH12138

Obviously there will be a problem with "base.pos_embed" if you use vit_base to run inference on Market1501.
Let me try to deal with it first. The authors may have linked the wrong weight file in the README table.

@KingH12138

KingH12138 commented Oct 21, 2024


Alright, I could only come up with a somewhat crude workaround. First, this is a mismatch between the instantiated model and the loaded weights.
Then I rewrote the load_param function in codes/TransReID-main/model/make_model.py as follows:
```python
def load_param(self, trained_path):
    param_dict = torch.load(trained_path)
    for i in param_dict:
        # print(self.state_dict()[i.replace('module.', '')].shape, param_dict[i].shape)
        cur_model_key = i.replace('module.', '')
        if cur_model_key in self.state_dict():
            self.state_dict()[cur_model_key].copy_(param_dict[i])
        else:
            print(f"Warning:{cur_model_key} not found in current model but in loaded dictionary.")
    print('Loading pretrained model from {}'.format(trained_path))
```

The point of this function is to copy the parameters that can be matched and skip the ones that cannot.
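Note that a key can also exist in both state dicts but with different shapes (that is where the original `copy_` RuntimeError came from), so a more defensive variant (my own sketch, not the repository's code) checks the shape as well:

```python
import torch

def load_param_partial(model, trained_path):
    """Copy only checkpoint parameters whose key AND shape match the model.

    A hypothetical variant of TransReID's load_param: besides skipping keys
    missing from the model, it also skips shape mismatches instead of letting
    copy_ raise a RuntimeError.
    """
    param_dict = torch.load(trained_path, map_location="cpu")
    model_state = model.state_dict()
    for key, value in param_dict.items():
        key = key.replace("module.", "")
        if key not in model_state:
            print(f"skip {key}: not in current model")
        elif model_state[key].shape != value.shape:
            print(f"skip {key}: checkpoint shape {tuple(value.shape)} "
                  f"!= model shape {tuple(model_state[key].shape)}")
        else:
            model_state[key].copy_(value)
    print(f"Loaded pretrained model from {trained_path}")
```

Since `state_dict()` of an `nn.Module` returns references to the live parameter tensors, `copy_` updates the model in place.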
This is my config:

```yaml
MODEL:
  PRETRAIN_CHOICE: 'imagenet'
  PRETRAIN_PATH: '/data/jhb_data/checkpoints/jx_vit_base_p16_224-80ecf9dd.pth'
  METRIC_LOSS_TYPE: 'triplet'
  IF_LABELSMOOTH: 'off'
  IF_WITH_CENTER: 'no'
  NAME: 'transformer'
  NO_MARGIN: True
  DEVICE_ID: ('2')
  TRANSFORMER_TYPE: 'vit_base_patch16_224_TransReID'
  STRIDE_SIZE: [12, 12]
  SIE_CAMERA: True
  SIE_COE: 3.0
  JPM: False
  RE_ARRANGE: True

INPUT:
  SIZE_TRAIN: [256, 128]
  SIZE_TEST: [256, 128]
  PROB: 0.5 # random horizontal flip
  RE_PROB: 0.5 # random erasing
  PADDING: 10
  PIXEL_MEAN: [0.5, 0.5, 0.5]
  PIXEL_STD: [0.5, 0.5, 0.5]

DATASETS:
  NAMES: ('market1501')
  ROOT_DIR: ('/data/jhb_data/datasets')

DATALOADER:
  SAMPLER: 'softmax_triplet'
  NUM_INSTANCE: 4
  NUM_WORKERS: 8

SOLVER:
  OPTIMIZER_NAME: 'SGD'
  MAX_EPOCHS: 120
  BASE_LR: 0.008
  IMS_PER_BATCH: 64
  WARMUP_METHOD: 'linear'
  LARGE_FC_LR: False
  CHECKPOINT_PERIOD: 120
  LOG_PERIOD: 50
  EVAL_PERIOD: 120
  WEIGHT_DECAY: 1e-4
  WEIGHT_DECAY_BIAS: 1e-4
  BIAS_LR_FACTOR: 2

TEST:
  EVAL: True
  IMS_PER_BATCH: 256
  RE_RANKING: False
  WEIGHT: '/data/jhb_data/checkpoints/vit_transreid_market.pth'
  NECK_FEAT: 'before'
  FEAT_NORM: 'yes'

OUTPUT_DIR: '../logs/market_vit_transreid_stride'
```

And this is the log:

```
2024-10-21 07:45:40,986 transreid INFO: Running with config:
DATALOADER:
  NUM_INSTANCE: 4
  NUM_WORKERS: 8
  SAMPLER: softmax_triplet
DATASETS:
  NAMES: market1501
  ROOT_DIR: /data/jhb_data/datasets
INPUT:
  PADDING: 10
  PIXEL_MEAN: [0.5, 0.5, 0.5]
  PIXEL_STD: [0.5, 0.5, 0.5]
  PROB: 0.5
  RE_PROB: 0.5
  SIZE_TEST: [256, 128]
  SIZE_TRAIN: [256, 128]
MODEL:
  ATT_DROP_RATE: 0.0
  COS_LAYER: False
  DEVICE: cuda
  DEVICE_ID: 2
  DEVIDE_LENGTH: 4
  DIST_TRAIN: False
  DROP_OUT: 0.0
  DROP_PATH: 0.1
  ID_LOSS_TYPE: softmax
  ID_LOSS_WEIGHT: 1.0
  IF_LABELSMOOTH: off
  IF_WITH_CENTER: no
  JPM: False
  LAST_STRIDE: 1
  METRIC_LOSS_TYPE: triplet
  NAME: transformer
  NECK: bnneck
  NO_MARGIN: True
  PRETRAIN_CHOICE: imagenet
  PRETRAIN_PATH: /data/jhb_data/checkpoints/jx_vit_base_p16_224-80ecf9dd.pth
  RE_ARRANGE: True
  SHIFT_NUM: 5
  SHUFFLE_GROUP: 2
  SIE_CAMERA: True
  SIE_COE: 3.0
  SIE_VIEW: False
  STRIDE_SIZE: [12, 12]
  TRANSFORMER_TYPE: vit_base_patch16_224_TransReID
  TRIPLET_LOSS_WEIGHT: 1.0
OUTPUT_DIR: ../logs/market_vit_transreid_stride
SOLVER:
  BASE_LR: 0.008
  BIAS_LR_FACTOR: 2
  CENTER_LOSS_WEIGHT: 0.0005
  CENTER_LR: 0.5
  CHECKPOINT_PERIOD: 120
  COSINE_MARGIN: 0.5
  COSINE_SCALE: 30
  EVAL_PERIOD: 120
  GAMMA: 0.1
  IMS_PER_BATCH: 64
  LARGE_FC_LR: False
  LOG_PERIOD: 50
  MARGIN: 0.3
  MAX_EPOCHS: 120
  MOMENTUM: 0.9
  OPTIMIZER_NAME: SGD
  SEED: 1234
  STEPS: (40, 70)
  WARMUP_EPOCHS: 5
  WARMUP_FACTOR: 0.01
  WARMUP_METHOD: linear
  WEIGHT_DECAY: 0.0001
  WEIGHT_DECAY_BIAS: 0.0001
TEST:
  DIST_MAT: dist_mat.npy
  EVAL: True
  FEAT_NORM: yes
  IMS_PER_BATCH: 256
  NECK_FEAT: before
  RE_RANKING: False
  WEIGHT: /data/jhb_data/checkpoints/vit_transreid_market.pth
=> Market1501 loaded
Dataset statistics:
  subset  | # ids | # images | # cameras
  train   |   751 |    12936 |         6
  query   |   750 |     3368 |         6
  gallery |   751 |    15913 |         6
using Transformer_type: vit_base_patch16_224_TransReID as a backbone
using stride: [12, 12], and patch number is num_y21 * num_x10
camera number is : 6
using SIE_Lambda is : 3.0
using drop_out rate is : 0.0
using attn_drop_out rate is : 0.0
using drop_path rate is : 0.1
```

And there are a lot of mismatched keys. All of them belong to the JPM branch (b1/b2, classifier_1-4, bottleneck_1-4): the released weights contain that branch, but this config disables it with JPM: False, so the instantiated model never creates those parameters.

```
Warning:b1.0.norm1.weight not found in current model but in loaded dictionary.
Warning:b1.0.norm1.bias not found in current model but in loaded dictionary.
Warning:b1.0.attn.qkv.weight not found in current model but in loaded dictionary.
Warning:b1.0.attn.qkv.bias not found in current model but in loaded dictionary.
Warning:b1.0.attn.proj.weight not found in current model but in loaded dictionary.
Warning:b1.0.attn.proj.bias not found in current model but in loaded dictionary.
Warning:b1.0.norm2.weight not found in current model but in loaded dictionary.
Warning:b1.0.norm2.bias not found in current model but in loaded dictionary.
Warning:b1.0.mlp.fc1.weight not found in current model but in loaded dictionary.
Warning:b1.0.mlp.fc1.bias not found in current model but in loaded dictionary.
Warning:b1.0.mlp.fc2.weight not found in current model but in loaded dictionary.
Warning:b1.0.mlp.fc2.bias not found in current model but in loaded dictionary.
Warning:b1.1.weight not found in current model but in loaded dictionary.
Warning:b1.1.bias not found in current model but in loaded dictionary.
Warning:b2.0.norm1.weight not found in current model but in loaded dictionary.
Warning:b2.0.norm1.bias not found in current model but in loaded dictionary.
Warning:b2.0.attn.qkv.weight not found in current model but in loaded dictionary.
Warning:b2.0.attn.qkv.bias not found in current model but in loaded dictionary.
Warning:b2.0.attn.proj.weight not found in current model but in loaded dictionary.
Warning:b2.0.attn.proj.bias not found in current model but in loaded dictionary.
Warning:b2.0.norm2.weight not found in current model but in loaded dictionary.
Warning:b2.0.norm2.bias not found in current model but in loaded dictionary.
Warning:b2.0.mlp.fc1.weight not found in current model but in loaded dictionary.
Warning:b2.0.mlp.fc1.bias not found in current model but in loaded dictionary.
Warning:b2.0.mlp.fc2.weight not found in current model but in loaded dictionary.
Warning:b2.0.mlp.fc2.bias not found in current model but in loaded dictionary.
Warning:b2.1.weight not found in current model but in loaded dictionary.
Warning:b2.1.bias not found in current model but in loaded dictionary.
Warning:classifier_1.weight not found in current model but in loaded dictionary.
Warning:classifier_2.weight not found in current model but in loaded dictionary.
Warning:classifier_3.weight not found in current model but in loaded dictionary.
Warning:classifier_4.weight not found in current model but in loaded dictionary.
Warning:bottleneck_1.weight not found in current model but in loaded dictionary.
Warning:bottleneck_1.bias not found in current model but in loaded dictionary.
Warning:bottleneck_1.running_mean not found in current model but in loaded dictionary.
Warning:bottleneck_1.running_var not found in current model but in loaded dictionary.
Warning:bottleneck_1.num_batches_tracked not found in current model but in loaded dictionary.
Warning:bottleneck_2.weight not found in current model but in loaded dictionary.
Warning:bottleneck_2.bias not found in current model but in loaded dictionary.
Warning:bottleneck_2.running_mean not found in current model but in loaded dictionary.
Warning:bottleneck_2.running_var not found in current model but in loaded dictionary.
Warning:bottleneck_2.num_batches_tracked not found in current model but in loaded dictionary.
Warning:bottleneck_3.weight not found in current model but in loaded dictionary.
Warning:bottleneck_3.bias not found in current model but in loaded dictionary.
Warning:bottleneck_3.running_mean not found in current model but in loaded dictionary.
Warning:bottleneck_3.running_var not found in current model but in loaded dictionary.
Warning:bottleneck_3.num_batches_tracked not found in current model but in loaded dictionary.
Warning:bottleneck_4.weight not found in current model but in loaded dictionary.
Warning:bottleneck_4.bias not found in current model but in loaded dictionary.
Warning:bottleneck_4.running_mean not found in current model but in loaded dictionary.
Warning:bottleneck_4.running_var not found in current model but in loaded dictionary.
Warning:bottleneck_4.num_batches_tracked not found in current model but in loaded dictionary.
```

However, in the end I got nice rank results:

[screenshot of the rank/mAP results]

My output is very similar to the authors'. Well, maybe it was just luck? I hope this answer helps you.
