About self-attention with position relation #23
Comments
But I see that DINO's DINOTransformerDecoder does not have the skip_relation parameter.
Hi @mitu752 The position relation is used during both training and validation. Specifically, our model has two branches: a main branch and an auxiliary branch. The skip_relation parameter controls whether the position relation is used: the main branch uses it (skip_relation=False), while the auxiliary branch does not (skip_relation=True). The auxiliary branch is only used during training, to speed up convergence. The snippet quoted in the question is the auxiliary branch. DINO has only one main branch, so it does not need skip_relation to distinguish between them.
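The two-branch scheme described above can be sketched as follows. This is a simplified, hypothetical illustration (ToyDecoder/ToyModel are not the repo's real classes); it only shows how skip_relation routes the main branch and the training-only auxiliary branch:

```python
class ToyDecoder:
    """Stands in for the relation-aware decoder; not the real implementation."""

    def __call__(self, query, skip_relation):
        # In the real decoder, skip_relation=False would add the position-relation
        # bias to the self-attention logits; here we only record the path taken.
        return {"query": query, "used_relation": not skip_relation}


class ToyModel:
    def __init__(self):
        self.decoder = ToyDecoder()
        self.training = True

    def forward(self, query, hybrid_query):
        # Main branch: always runs, with the position relation enabled.
        outputs = [self.decoder(query, skip_relation=False)]
        if self.training:
            # Auxiliary (hybrid) branch: training only, position relation
            # skipped; used to speed up convergence.
            outputs.append(self.decoder(hybrid_query, skip_relation=True))
        return outputs
```

At validation time `training` is False, so only the main (relation-aware) branch runs, which is why the position relation is still used at inference.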
One more question: I see that the code does not pass attn_mask during validation.
attn_mask is the self-attention mask between queries;
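For context, a minimal sketch of how such a query self-attention mask is typically built in DN-DETR/DINO-style models (this is an assumed construction for illustration, not the repo's exact code; `build_query_attn_mask` is a hypothetical helper, and the real model would use a torch.BoolTensor):

```python
import numpy as np

def build_query_attn_mask(num_matching, num_groups, group_size):
    """Boolean query self-attention mask; True means attention is blocked.

    Denoising queries come first, split into groups; matching queries follow.
    Denoising groups must not attend to each other, and matching queries must
    not attend to denoising queries (to avoid label leakage). At validation
    there are no denoising queries, so no attn_mask needs to be passed.
    """
    num_dn = num_groups * group_size
    total = num_dn + num_matching
    mask = np.zeros((total, total), dtype=bool)
    for g in range(num_groups):
        s, e = g * group_size, (g + 1) * group_size
        mask[s:e, :s] = True        # block earlier denoising groups
        mask[s:e, e:num_dn] = True  # block later denoising groups
    mask[num_dn:, :num_dn] = True   # matching queries cannot see denoising queries
    return mask
```

Since the denoising/auxiliary queries exist only during training, the mask is simply omitted at validation.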
Question
Hello, I'd like to ask: is this position relation only used during validation?
From models/bricks/relation_transformer.py:
```python
if self.training:
    hybrid_classes, hybrid_coords = self.decoder(
        query=hybrid_target,
        value=memory,
        key_padding_mask=mask_flatten,
        reference_points=hybrid_reference_points,
        spatial_shapes=spatial_shapes,
        level_start_index=level_start_index,
        valid_ratios=valid_ratios,
        skip_relation=True,
    )
```
Additional information
No response