feat: consistent type embedding #3617
Conversation
Signed-off-by: Jinzhe Zeng <[email protected]>
Codecov Report
Attention: Patch coverage is

Additional details and impacted files

@@            Coverage Diff             @@
##            devel    #3617      +/-   ##
==========================================
+ Coverage    77.70%   77.90%    +0.19%
==========================================
  Files          434      402       -32
  Lines        37541    32821     -4720
  Branches      1623      909      -714
==========================================
- Hits         29170    25568     -3602
+ Misses        7507     6725      -782
+ Partials       864      528      -336

☔ View full report in Codecov by Sentry.
Signed-off-by: Jinzhe Zeng <[email protected]>
Signed-off-by: Jinzhe Zeng <[email protected]>
It seems that with this PR, when neuron == [], the embedding is equivalent to the identity rather than a linear mapping. It seems that the implementation should be via the FittingNet, not the EmbeddingNet. Please check if I am wrong.
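For illustration, here is a minimal NumPy sketch of the distinction raised above. It is not the deepmd-kit implementation: embedding_like and fitting_like are hypothetical stand-ins for the EmbeddingNet and FittingNet behavior being discussed. A net that only chains hidden layers degenerates to the identity when neuron == [], while a fitting-style net keeps a final linear output layer and therefore still implements a linear mapping.

```python
# Minimal sketch (NOT the deepmd-kit implementation) of identity vs.
# linear mapping when neuron == []. embedding_like / fitting_like are
# hypothetical stand-ins for EmbeddingNet / FittingNet behavior.
import numpy as np

rng = np.random.default_rng(0)

def embedding_like(x, neuron):
    """Chain hidden layers only; neuron == [] leaves x untouched."""
    for width in neuron:
        w = rng.standard_normal((x.shape[-1], width))
        b = rng.standard_normal(width)
        x = np.tanh(x @ w + b)   # hidden layer with activation
    return x                     # no layers -> identity

def fitting_like(x, neuron, out_dim):
    """Chain hidden layers, then always apply a final linear layer."""
    x = embedding_like(x, neuron)
    w = rng.standard_normal((x.shape[-1], out_dim))
    b = rng.standard_normal(out_dim)
    return x @ w + b             # linear even when neuron == []

x = np.eye(4)                                  # one-hot type inputs
print(np.allclose(embedding_like(x, []), x))   # True: pure identity
print(fitting_like(x, [], 8).shape)            # (4, 8): still a linear map
```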
Signed-off-by: Jinzhe Zeng <[email protected]>
I agree with this PR, but with one comment: will we keep the bias for TypeEmbedNet in each layer? It may be confusing for one-hot analysis on the embedding weights, such as interpolation between different elements.
Regarding keeping or removing the bias: we should not remove the bias if we do not fix the activation function to linear. The configuration of the type embedding may need further discussion, i.e., whether we allow flexible configurations for the type embedding.
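To illustrate the concern about the bias term, here is a hypothetical NumPy sketch (not deepmd-kit code, and the variable names are assumptions): with one-hot inputs, a linear type embedding with bias maps type i to W[i] + b, so analyses that read the weight rows W[i] directly, e.g. interpolating between elements, silently drop the shared offset b, while pairwise differences between embeddings cancel it.

```python
# Hypothetical sketch (not deepmd-kit code) of why a bias term can be
# confusing for one-hot analysis of type-embedding weights: with one-hot
# inputs the embedding of type i is W[i] + b, so the raw weight row W[i]
# alone misses the shared offset b.
import numpy as np

rng = np.random.default_rng(1)
ntypes, dim = 4, 8
W = rng.standard_normal((ntypes, dim))
b = rng.standard_normal(dim)

one_hot = np.eye(ntypes)
emb = one_hot @ W + b                      # linear type embedding with bias

# The actual embedding of type 0 differs from its weight row by b:
print(np.allclose(emb[0], W[0]))           # False
print(np.allclose(emb[0] - W[0], b))       # True

# Pairwise differences cancel the bias, so comparisons between types
# are safe, but interpolating on weight rows alone drops the offset:
print(np.allclose(emb[0] - emb[1], W[0] - W[1]))   # True
```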
No description provided.