
Confused about the ENT and AdENT #8

Open · ustcjerry opened this issue Aug 24, 2020 · 3 comments

@ustcjerry

SSDA_MME/main.py

Lines 195 to 204 in 81c3a9c

```python
if args.method == 'ENT':
    loss_t = entropy(F1, output, args.lamda)
    loss_t.backward()
    optimizer_f.step()
    optimizer_g.step()
elif args.method == 'MME':
    loss_t = adentropy(F1, output, args.lamda)
    loss_t.backward()
    optimizer_f.step()
    optimizer_g.step()
```

SSDA_MME/utils/loss.py

Lines 28 to 41 in 81c3a9c

```python
def entropy(F1, feat, lamda, eta=1.0):
    out_t1 = F1(feat, reverse=True, eta=-eta)
    out_t1 = F.softmax(out_t1)
    loss_ent = -lamda * torch.mean(torch.sum(out_t1 *
                                             (torch.log(out_t1 + 1e-5)), 1))
    return loss_ent


def adentropy(F1, feat, lamda, eta=1.0):
    out_t1 = F1(feat, reverse=True, eta=eta)
    out_t1 = F.softmax(out_t1)
    loss_adent = lamda * torch.mean(torch.sum(out_t1 *
                                              (torch.log(out_t1 + 1e-5)), 1))
    return loss_adent
```

Thank you for your code.
From the code, it seems that:
the ENT method tries to minimize entropy on the classifier but maximize it on the feature extractor;
the AdENT method tries to maximize entropy on the classifier but minimize it on the feature extractor, which is what your paper proposes.

BUT, in the paper, the ENT method seems to be described as minimizing entropy on both the classifier and the feature extractor, following Yves Grandvalet and Yoshua Bengio, "Semi-supervised learning by entropy minimization", NIPS 2005.

So I'm very confused about this. I'm looking forward to hearing from you.
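
For anyone else who trips over the same thing: the sign of eta is what distinguishes the two losses. Below is a minimal sketch assuming the usual gradient-reversal formulation (identity in the forward pass, gradient scaled by -eta in the backward pass); the GradReverse class here is only illustrative and is not the repo's own grad_reverse layer, which may differ in details.

```python
import torch
from torch.autograd import Function

# Illustrative gradient-reversal layer: identity forward, gradient
# scaled by -eta on the way back.
class GradReverse(Function):
    @staticmethod
    def forward(ctx, x, eta):
        ctx.eta = eta
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Gradient w.r.t. x is scaled by -eta; eta itself gets no gradient.
        return -ctx.eta * grad_output, None

x = torch.randn(4, 8, requires_grad=True)

# entropy() passes eta = -eta, so the scale is -(-eta) = +eta: the gradient
# reaching the feature extractor is NOT flipped, and G and F1 both descend
# the same positive-entropy loss (plain entropy minimization).
GradReverse.apply(x, -1.0).sum().backward()
print(x.grad[0, 0])   # same sign as a plain identity layer

# adentropy() passes eta = +eta, so the gradient to the feature extractor
# IS flipped: F1 minimizes the loss while G effectively maximizes it.
x.grad = None
GradReverse.apply(x, 1.0).sum().backward()
print(x.grad[0, 0])   # sign flipped
```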

@ustcjerry
Author

Oops, it seems that I missed the parameter eta. Sorry ...

But wouldn't it be clearer to set reverse=False in the entropy function, with a positive eta?
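
For reference, the rewrite suggested above would presumably look something like this. It is a hypothetical sketch, not code from the repo, and assumes F1 can be called without the reverse/eta arguments; under the usual gradient-reversal convention it should behave the same as the current entropy().

```python
import torch
import torch.nn.functional as F

def entropy_no_reverse(F1, feat, lamda):
    # No gradient reversal: plain entropy of the classifier output,
    # minimized by both the classifier F1 and the feature extractor G.
    out_t1 = F1(feat)                     # reverse=False, no eta needed
    p = F.softmax(out_t1, dim=1)
    loss_ent = -lamda * torch.mean(
        torch.sum(p * torch.log(p + 1e-5), dim=1))
    return loss_ent                       # non-negative entropy term
```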

@ZHAO-Fengnian

> Oops, it seems that I missed the parameter eta. Sorry ...
>
> But wouldn't it be clearer to set reverse=False in the entropy function, with a positive eta?

Dear author,
when I use adentropy, the loss_adent I get is below zero. It seems that you forgot to add a minus sign in front of it. I am not running your experiments directly, so I don't know whether it is correct in your case. But out_t1 is in [0, 1], so the log produces a negative number, and loss_adent is therefore always negative, right? I really look forward to your reply.
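
A quick numeric check of the sign, just to make the point concrete (lamda = 0.1 here is an arbitrary value for illustration, not the paper's setting):

```python
import torch
import torch.nn.functional as F

lamda = 0.1
logits = torch.tensor([[2.0, 0.5, -1.0]])
p = F.softmax(logits, dim=1)

# sum(p * log p) is <= 0 for any probability distribution, so
# lamda * mean(sum(p * log p)) is always <= 0.
loss_adent = lamda * torch.mean(torch.sum(p * torch.log(p + 1e-5), dim=1))
print(loss_adent)   # negative

# Minimizing this negative quantity pushes sum(p * log p) further down,
# i.e. it increases the entropy on the classifier side, while the
# gradient-reversal layer flips the sign for the feature extractor.
# So a negative loss_adent looks intentional rather than a missing minus.
```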

@ZHAO-Fengnian

I see now that the loss should be negative.
