
A question about the “appearance code” #55

Closed
BoomShakaY opened this issue Sep 12, 2020 · 3 comments

Comments

@BoomShakaY

Thanks for your code!
I have a question about how the loss Lprim constrains the encoder to extract the appearance code. If I want to change the loss so that it extracts other kinds of features (e.g. weather), where should I modify the code?

If I change Lprim, then the teacher model becomes unused.

I have seen in #40 , you said "f is the appearance code for image generation; We do not want the generation losses to update f. Thus, we use the detach here. In this way, f is mainly updated via the re-id related losses."

The re-id related losses include CrossEntropyLoss and KLDivLoss, which are calculated against labels, so I'm confused about how they update f.

Thank you!

@layumi
Contributor

layumi commented Sep 14, 2020

Hi @BoomShakaY

  1. If you have labels for your dataset, you only need to prepare your dataset like Market, with $n$ subfolders, and change the class number $n$ in the config file. For example, if your dataset contains 300 classes, your dataset folder should contain 300 subfolders.

  2. You need to train your own teacher model to use KLDivLoss.

@BoomShakaY
Author

BoomShakaY commented Sep 15, 2020

Thanks for your reply!

I want to extract other kinds of features based on ReID, just as you do. Yes, I have labels, but the class number is only 9. If I simply change class_num from 751 to 9, won't it deviate from the original intention?

I'm just wondering: if I add a classifier beside the last two classifiers of f_netAB and use another loss to constrain it, would that work well? I'm still trying because I'm new to coding. By the way, thanks for your code again; it really helps a lot!

@layumi
Contributor

layumi commented Oct 4, 2020

@BoomShakaY
You mean adding a third classifier? It will work if you set an appropriate loss weight. You may give it a try.
