[RFC] Deprecate callback hooks `on_init_start` and `on_init_end` #10894
Comments
I agree this part is probably the strangest thing about these hooks. For context, the `on_init_start` and `on_init_end` hooks were introduced way back in #889.
Is there a strategy for migrating away from these hooks? I've been using them to create a custom logger:

```python
def on_init_end(self, trainer: 'pl.Trainer') -> None:
    import pathlib
    # Resolve the log directory chosen by the Trainer and open a text log file there.
    self.log_dir = pathlib.Path(trainer.log_dir)
    self.log_fpath = self.log_dir / 'text_logs.log'
    self._log = _InstanceLogger.from_instance(trainer, self.log_fpath)
    self._log.info('on_init_end')
    self._log.info('sys.argv = {!r}'.format(sys.argv))
    # Expose this callback on the trainer so other code can reuse the text logger.
    trainer.text_logger = self
    if self.args is not None:
        self._log.info('args_dict = {}'.format(ub.repr2(self.args.__dict__, nl=1, sort=0)))
```

Am I SOL when 1.8 drops?
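(Editor's note: a minimal sketch of one possible migration path, for illustration only and not an official recommendation. It assumes the standard `setup` callback hook, which runs later than `on_init_end` and receives the model as well as the trainer, and it stands in stdlib `logging` for the commenter's own `_InstanceLogger` and `ub` helpers.)

```python
import logging
import pathlib
import sys

import pytorch_lightning as pl


class TextLoggerCallback(pl.Callback):
    """Illustrative only: the original on_init_end logic moved into setup."""

    def __init__(self, args=None):
        self.args = args

    def setup(self, trainer, pl_module, stage=None):
        # trainer.log_dir is populated by this point, as it was in on_init_end.
        self.log_dir = pathlib.Path(trainer.log_dir)
        self.log_fpath = self.log_dir / "text_logs.log"

        # Plain stdlib logging stands in for the commenter's _InstanceLogger helper.
        self._log = logging.getLogger(__name__)
        self._log.addHandler(logging.FileHandler(self.log_fpath))
        self._log.info("setup")
        self._log.info("sys.argv = %r", sys.argv)

        # Keep the back-reference the original hook created.
        trainer.text_logger = self
        if self.args is not None:
            self._log.info("args_dict = %r", vars(self.args))
```

The trade-off is timing: `setup` fires when `trainer.fit`/`validate`/`test` starts rather than during `Trainer.__init__`, so this only works if nothing needs to happen at construction time.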
Have you tried the
Hey!
@NamanPundir Can you share more from that error message please? Please note that
Ahh, I figured. Anyway, I am using the Hugging Face trainer for early stopping now. It's working. :) Thanks
Discussed in #10677
Part of #7740
Originally posted by daniellepintz November 22, 2021
These hooks are called when trainer initialization begins and ends, before the model has been set, essentially allowing the user to modify the Trainer constructor. Should we be giving the user this much control over the Trainer constructor? Are there scenarios where this is needed, or can we deprecate these hooks?
https://github.com/PyTorchLightning/pytorch-lightning/blob/338f3cf63686935355c749920b2f298f3d18a26f/pytorch_lightning/trainer/callback_hook.py#L55-L63
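(Editor's note: for concreteness, a minimal sketch of the kind of usage under discussion, assuming the pre-1.8 hook signatures; the attribute being set in `on_init_end` is purely illustrative.)

```python
import pytorch_lightning as pl


class InitTweaker(pl.Callback):
    """Illustrative callback that piggybacks on Trainer construction."""

    def on_init_start(self, trainer):
        # Runs while Trainer.__init__ is still executing, before any model is attached.
        print("Trainer construction started")

    def on_init_end(self, trainer):
        # Runs at the end of Trainer.__init__; the callback can mutate the trainer here.
        trainer.my_custom_flag = True


trainer = pl.Trainer(callbacks=[InitTweaker()])
```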
@carmocca @tchaton @awaelchli do you know how these hooks are used? Have you seen any examples of these being used by the community? These hooks go way, way back, but I can't think of when they'd be needed given the user handles the Trainer initialization. It's also unclear when `on_init_start` actually happens: does that mean callbacks should be the first thing initialized?
We also have correctness issues for these hooks when callbacks are passed directly to the Trainer vs. configured from the LightningModule:
https://github.com/PyTorchLightning/pytorch-lightning/blob/f407a00cec59886cbaf8816630f6760e34926bd5/pytorch_lightning/core/lightning.py#L1140-L1142
Therefore, as a user, it seems more concise to write this:
vs this:
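(Editor's note: the original code snippets for this comparison were not captured in this extract. The following is a hedged guess at its shape, based on the surrounding discussion: configuring the Trainer directly at construction time versus mutating it from an init hook. The EarlyStopping example is illustrative only.)

```python
import pytorch_lightning as pl

# Direct configuration at construction time (the "more concise" option):
trainer = pl.Trainer(
    max_epochs=10,
    callbacks=[pl.callbacks.EarlyStopping(monitor="val_loss")],
)


# Versus routing the same change through an init hook:
class AddEarlyStopping(pl.Callback):
    def on_init_end(self, trainer):
        # Mutate the freshly constructed Trainer from inside a callback.
        trainer.callbacks.append(pl.callbacks.EarlyStopping(monitor="val_loss"))


trainer = pl.Trainer(max_epochs=10, callbacks=[AddEarlyStopping()])
```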
cc @awaelchli @ananthsub @tchaton @carmocca @Borda @ninginthecloud