reparametrization/marginalization for feature mask #44

Open
DimGorr opened this issue Jul 26, 2024 · 0 comments
Comments


DimGorr commented Jul 26, 2024

Hi!
I have a question about the feature selection part. The article claims that, in order to solve the potential issue with important features whose values are zero, you 1) use a Monte Carlo estimate to sample the feature subset and then 2) apply a reparametrization.

So my questions are:
a) I'm struggling to find the implementation of 1). Could you maybe help me with that? And if it isn't implemented, would that be a problem for important features that are equal to zero?
b) I found the reparametrization part, but it is set to False by default (the variable marginalize below; the code is taken from class ExplainModule(nn.Module), function forward), and I don't see any place where this marginalization is set to True. Is it just because you forgot to change it back after some testing, or does the reparametrization worsen the results?

if marginalize:
    std_tensor = torch.ones_like(x, dtype=torch.float) / 2
    mean_tensor = torch.zeros_like(x, dtype=torch.float) - x
    z = torch.normal(mean=mean_tensor, std=std_tensor)
    x = x + z * (1 - feat_mask)
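
For reference, here is a minimal self-contained sketch (my own, with made-up tensor values) of what that branch computes when marginalize is True:

import torch

torch.manual_seed(0)

# Toy inputs (made up for illustration): x is a node-feature matrix and
# feat_mask is the learned feature mask with entries in [0, 1].
x = torch.tensor([[0.0, 1.5, 0.0],
                  [2.0, 0.0, 3.0]])
feat_mask = torch.tensor([0.9, 0.1, 0.5])

# The marginalize branch: masked-out entries are perturbed by Gaussian noise
# z ~ N(-x, 1/2), so x + z has zero mean where the mask is 0 while x stays
# essentially unchanged where the mask is close to 1.
std_tensor = torch.ones_like(x, dtype=torch.float) / 2
mean_tensor = torch.zeros_like(x, dtype=torch.float) - x
z = torch.normal(mean=mean_tensor, std=std_tensor)
x = x + z * (1 - feat_mask)

print(x)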

c) Do you happen to know whether PyTorch Geometric has exactly the same implementation as here?

I'm actually asking this to find out whether PyTorch Geometric could mistakenly report a feature as unimportant just because its value happens to be zero :) Thank you in advance!
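
To make the concern concrete, here is a small toy check (mine, not code from the repo): with a purely multiplicative mask, a feature whose value is exactly zero produces a zero gradient on its mask entry, so the optimizer gets no signal that the feature matters.

import torch

# Toy check: gradient on the mask entry of a zero-valued feature.
x = torch.tensor([0.0, 2.0, -1.0])          # first feature is exactly zero
feat_mask = torch.zeros(3, requires_grad=True)

masked_x = x * torch.sigmoid(feat_mask)     # purely multiplicative masking
loss = masked_x.sum()
loss.backward()

print(feat_mask.grad)                       # first entry is 0.0: no signal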
