Hello, I noticed that two layers, self.kb and self.kc, are not used in your model code, so I commented them out. However, the F-score dropped after commenting them out. Do you know why?
self.att = SelfAttention(input_size=self.m, output_size=self.m)
self.ka = nn.Linear(in_features=self.m, out_features=1024)
self.kb = nn.Linear(in_features=self.ka.out_features, out_features=1024)  # never used in forward()
self.kc = nn.Linear(in_features=self.kb.out_features, out_features=1024)  # never used in forward()
self.kd = nn.Linear(in_features=self.ka.out_features, out_features=1)     # consumes ka's output directly
Hi, to be honest I don't know. The layers are indeed not used, so commenting them out should not affect performance. I tried it last night and the results were fine, but that test was done with different versions of Python, PyTorch, and other libraries. Perhaps there is a bug in PyTorch?
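One plausible explanation, not confirmed in this thread: constructing an `nn.Linear` draws from PyTorch's global RNG to initialize its weights and bias, even if the layer is never used in `forward()`. Removing `kb` and `kc` therefore shifts the RNG state, so every layer created afterwards (such as `kd`) gets a different initialization under the same seed, which can change the F-score without any bug. A minimal sketch of the effect:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
ka1 = nn.Linear(8, 1024)
kb1 = nn.Linear(1024, 1024)   # unused layer, but its init still consumes RNG draws
kd1 = nn.Linear(1024, 1)

torch.manual_seed(0)
ka2 = nn.Linear(8, 1024)
# kb commented out: the RNG state now differs when kd is constructed
kd2 = nn.Linear(1024, 1)

print(torch.equal(ka1.weight, ka2.weight))  # True: identical up to this point
print(torch.equal(kd1.weight, kd2.weight))  # False: init diverged after the removed layer
```

If this is the cause, the two runs are simply different random initializations, and averaging results over several seeds should show comparable performance with and without the unused layers.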