fix: only train for one batch in PyTorch Trainer test mode #8260
Conversation
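The change under review, first revision: `searcher_length` is renamed to `train_length`, and in local training it is clamped to a single batch when `test_mode` is set: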
```diff
 if self.local_training:
-    searcher_length = self.max_length
+    train_length = Batch(1) if self.test_mode else self.max_length
 else:
-    searcher_length = TrainUnit._from_searcher_unit(
+    train_length = TrainUnit._from_searcher_unit(
         op.length, self.searcher_unit, self.global_batch_size
     )
-assert searcher_length
+assert train_length
```
Just confirming: we don't support `--test` without `--local` through any codepath except the legacy codepath, right? This seems correct under that condition, but it might make more sense to check for test mode first, then local, since if we did support non-local test mode we would still only train one batch.
good point. updated.
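Updated revision: `train_length` is computed first and then overridden when `test_mode` is set: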
```diff
 if self.local_training:
-    searcher_length = self.max_length
+    train_length = self.max_length
 else:
-    searcher_length = TrainUnit._from_searcher_unit(
+    train_length = TrainUnit._from_searcher_unit(
         op.length, self.searcher_unit, self.global_batch_size
     )
-assert searcher_length
+assert train_length
+if self.test_mode:
+    train_length = Batch(1)
```
nit: reorder your if statements so you aren't mutating variables; that would feel a lot more natural:
```python
if self.test_mode:
    train_length = Batch(1)
elif self.local_training:
    train_length = self.max_length
else:
    train_length = TrainUnit._from_searcher_unit(...)
```
duh. thanks.
Description
A previous change that enabled training lengths longer than the searcher length introduced a bug: with the `test_mode` flag set, the trainer trains for `max_length` instead of just one batch.
Test Plan
Run a script that uses the PyTorch Trainer API (example for MNIST):
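The PR does not inline the script, so here is a minimal sketch of what such a run could look like. The `model_def`/`MNistTrial` names are hypothetical stand-ins for the MNIST example's trial class, and the `fit()` call assumes the Trainer API's `test_mode` and `max_length` parameters referenced by this PR:

```python
from determined import pytorch

# Hypothetical import: the MNIST example's PyTorchTrial subclass.
from model_def import MNistTrial


def main() -> None:
    # pytorch.init() provides a PyTorchTrialContext for a local run.
    with pytorch.init() as train_context:
        trial = MNistTrial(train_context)
        trainer = pytorch.Trainer(trial, train_context)
        # With test_mode=True, the fixed trainer should run exactly one
        # batch, regardless of the max_length passed here.
        trainer.fit(max_length=pytorch.Epoch(1), test_mode=True)


if __name__ == "__main__":
    main()
```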
Verify in stdout that the code trains for one batch only.
Commentary (optional)
Checklist
Release note added under docs/release-notes/ (see Release Note for details).
Ticket