
How is the positional embedding implemented? #24

Open
YifanDengWHU opened this issue Aug 21, 2021 · 3 comments

Comments

@YifanDengWHU

Hi, I want to ask how the positional embedding is implemented in the model.
The code calls xyz = self.pos_xyz(xyz), but self.pos_xyz is not defined anywhere.

@amiltonwong

It seems that no positional encoding is actually implemented in this PCT model...

@MenghaoGuo
Owner

self.pos_xyz can be a simple MLP network, e.g., two fully-connected layers.
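
For reference, a minimal sketch of what such a two-fully-connected-layer module could look like in PyTorch. The module name PosEmbedding and the channel sizes (64 hidden, 128 output) are assumptions for illustration, not taken from the repository:

```python
import torch.nn as nn

class PosEmbedding(nn.Module):
    """Hypothetical stand-in for self.pos_xyz: a small MLP that lifts
    raw xyz coordinates into the attention feature dimension.
    Hidden/output sizes are assumed, not from the PCT code."""

    def __init__(self, out_channels=128, hidden_channels=64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(3, hidden_channels),   # xyz -> hidden features
            nn.ReLU(inplace=True),
            nn.Linear(hidden_channels, out_channels),
        )

    def forward(self, xyz):
        # xyz: (batch, num_points, 3) point coordinates
        return self.mlp(xyz)  # (batch, num_points, out_channels)

# Sketch of how it would be used in the model:
# self.pos_xyz = PosEmbedding(out_channels=128)
# ...
# xyz = self.pos_xyz(xyz)  # per-point positional embedding
```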

@marshzero

I wonder whether self.pos_xyz on line 165 is the same as self.conv_pos on line 146 in /networks/cls/pct.py.
