
Split out train and update_batch_norm in Librispeech workloads #163

Open
znado opened this issue Oct 8, 2022 · 0 comments
Comments

@znado
Contributor

znado commented Oct 8, 2022

Currently, update_batch_norm just runs the Librispeech workloads in train mode, which also runs dropout in train mode. The purpose of giving model_fn() separate mode and update_batch_norm kwargs was to let submitters control batch-norm updates and dropout independently, if desired. We can update Conformer.__call__ and Deepspeech.__call__ to take both train and update_batch_norm and pass them to dropout and batch norm respectively, as sketched below.
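A minimal Flax sketch of the proposed split, assuming a block shaped roughly like the Conformer/Deepspeech layers; the module name, shapes, and dropout rate are illustrative, not the repo's actual code:

```python
import jax
import jax.numpy as jnp
import flax.linen as nn


class ExampleBlock(nn.Module):
  """Illustrative block whose dropout and batch norm are controlled separately."""
  dropout_rate: float = 0.1

  @nn.compact
  def __call__(self, x, train: bool, update_batch_norm: bool):
    # Batch-norm statistics are updated only when update_batch_norm is True.
    x = nn.BatchNorm(use_running_average=not update_batch_norm)(x)
    # Dropout is stochastic only when train is True.
    return nn.Dropout(rate=self.dropout_rate, deterministic=not train)(x)


x = jnp.ones((2, 8))
model = ExampleBlock()
variables = model.init(
    jax.random.PRNGKey(0), x, train=False, update_batch_norm=False)
# Train-mode dropout while keeping batch-norm statistics frozen:
out, _ = model.apply(
    variables, x, train=True, update_batch_norm=False,
    rngs={'dropout': jax.random.PRNGKey(1)}, mutable=['batch_stats'])
```

With the flags split this way, a submitter can run dropout stochastically while freezing batch-norm statistics (or vice versa), instead of having both tied to a single train flag.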
