
[QST]In batch sampler #1100

Open
Tselmeg-C opened this issue Jun 28, 2024 · 1 comment
Labels
question Further information is requested

Comments

@Tselmeg-C
❓ Questions & Help

Details

This InBatchSampler parameter applies only to the TwoTowerModel, not to the DLRM model, right? For DLRM, do I need to feed both positive and negative samples for training and validation? Is that correct?

@Tselmeg-C Tselmeg-C added the question Further information is requested label Jun 28, 2024
@CarloNicolini

It took me many head-scratches to understand the InBatchSampler.
First of all, it seems that, at least with 23.08, InBatchSamplerV2 is used as the default sampler instead of mm.InBatchSampler (when TwoTowerModelV2 has negative_samplers set to None).
Second, InBatchSamplerV2 simply yields the batch itself, drawing no distinction between positives and negatives, at least from the targets' point of view.

I would like someone more expert than me to help me understand the main logic behind the negative sampling strategies, and how to apply popularity bias correction to the InBatchSamplerV2 without running into a handful of errors.
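For anyone else puzzling over this: the usual logic behind in-batch negatives is that, for each query in a batch of size B, its own item is the positive and the other B-1 items in the same batch serve as negatives, so the diagonal of the query-item similarity matrix holds the positive logits. Popularity bias correction (often called the logQ correction) then subtracts the log of each item's sampling probability from the logits, since popular items show up as in-batch negatives disproportionately often. Below is a conceptual NumPy sketch of this idea, not Merlin's actual implementation; the function name and signature are made up for illustration:

```python
import numpy as np

def in_batch_sampled_softmax(query_emb, item_emb, item_probs=None):
    """Conceptual in-batch sampled-softmax loss (NOT Merlin's code).

    query_emb, item_emb: (B, d) arrays; row i of item_emb is the
    positive for row i of query_emb, and the other B-1 rows act
    as negatives. item_probs: optional (B,) array of each item's
    sampling probability, used for the logQ popularity correction.
    """
    logits = query_emb @ item_emb.T  # (B, B) similarity matrix
    if item_probs is not None:
        # logQ correction: subtract log sampling probability so
        # frequently sampled (popular) items are not over-penalized
        # when they appear as in-batch negatives.
        logits = logits - np.log(item_probs)[None, :]
    # Row-wise log-softmax (numerically stabilized).
    logits = logits - logits.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Positives sit on the diagonal; return mean cross-entropy.
    idx = np.arange(len(query_emb))
    return -log_probs[idx, idx].mean()
```

One sanity check on the correction: with a uniform item_probs vector, the correction subtracts the same constant from every logit in a row, and the softmax (hence the loss) is unchanged; only a non-uniform popularity distribution shifts the loss.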
