
[Train] Fix prepare_data_loader with enable_reproducibility #30266

Merged

Conversation


@Yard1 Yard1 commented Nov 14, 2022

Signed-off-by: Antoni Baum [email protected]

Why are these changes needed?

Calling train.torch.enable_reproducibility before train.torch.prepare_data_loader raises an exception if num_workers in the DataLoader is greater than 0 and worker_init_fn in the DataLoader is not set. The exception occurs because worker_init_fn, which is None in that case, is invoked as a callable inside seeded_worker_init_fn. This code path was untested.

This PR fixes this oversight and ensures that this is tested in CI (and also removes a duplicate test in the process).
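To illustrate the failure mode, here is a minimal sketch (not Ray's actual implementation; the function name and signature are assumptions for illustration) of a seeded worker-init wrapper that guards against the user-provided worker_init_fn being None:

```python
import random


def seeded_worker_init_fn(base_seed, worker_init_fn=None):
    """Hypothetical sketch: wrap an optional user worker_init_fn so that
    each DataLoader worker is seeded deterministically."""

    def wrapper(worker_id):
        # Derive a deterministic per-worker seed.
        random.seed(base_seed + worker_id)
        # Guard against worker_init_fn being None -- calling None as a
        # callable is exactly the bug described above.
        if worker_init_fn is not None:
            worker_init_fn(worker_id)

    return wrapper
```

Without the `if worker_init_fn is not None` guard, any DataLoader created with num_workers > 0 and no worker_init_fn would crash with a TypeError when its workers start.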

Related issue number

Closes #30247

Checks

  • I've signed off every commit (by using the -s flag, i.e., git commit -s) in this PR.
  • I've run scripts/format.sh to lint the changes in this PR.
  • I've included any doc changes needed for https://docs.ray.io/en/master/.
  • I've made sure the tests are passing. Note that there might be a few flaky tests, see the recent failures at https://flakey-tests.ray.io/
  • Testing Strategy
    • Unit tests
    • Release tests
    • This PR is not tested :(


@amogkam amogkam left a comment


Thanks @Yard1!

@amogkam amogkam merged commit 01b8c33 into ray-project:master Nov 15, 2022
@Yard1 Yard1 deleted the fix_prepare_data_loader_reproductible branch November 15, 2022 11:43
WeichenXu123 pushed a commit to WeichenXu123/ray that referenced this pull request Dec 19, 2022
…project#30266)


Signed-off-by: Antoni Baum <[email protected]>
Signed-off-by: Weichen Xu <[email protected]>
Successfully merging this pull request may close these issues.

[Air|Tune] enable_reproducibility is not compatible with prepare_data_loader