I am currently using optim.adam to train my network. Let's say I train the network up to the x-th epoch and save my model; which settings in the optim function do I need to save in order to continue training?
I noticed that if I just load my saved model, the computed loss does not follow the previous trend (it actually jumps back to roughly the loss from the first epoch). There must be some state I need to reload in order to recover a similar loss.
The way I compare results: I compute the loss at epoch x + n in an uninterrupted run, having also saved the model at epoch x. Then I reload the model saved at epoch x, train for n more epochs, and compare the losses.
Technically speaking they should be similar. I hope someone can shed some light on this issue.
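For what it's worth, Adam keeps per-parameter running moment estimates and a step counter in its state table, and resuming with that state reset is enough to break the loss trend. Here is a minimal pure-Python sketch of the effect (a toy 1-D quadratic, not the torch/optim API; the Adam update itself follows the standard formula):

```python
import copy

def adam_step(w, g, state, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    # state holds the step counter and running moment estimates --
    # exactly the kind of thing that must be saved with the model.
    state["t"] += 1
    state["m"] = b1 * state["m"] + (1 - b1) * g
    state["v"] = b2 * state["v"] + (1 - b2) * g * g
    m_hat = state["m"] / (1 - b1 ** state["t"])  # bias correction
    v_hat = state["v"] / (1 - b2 ** state["t"])
    return w - lr * m_hat / (v_hat ** 0.5 + eps)

def loss(w):  # toy objective, minimum at w = 3
    return (w - 3.0) ** 2

def grad(w):
    return 2.0 * (w - 3.0)

fresh = lambda: {"t": 0, "m": 0.0, "v": 0.0}

# Uninterrupted run: 10 Adam steps from w = 0.
w, state = 0.0, fresh()
for _ in range(10):
    w = adam_step(w, grad(w), state)
ref_loss = loss(w)

# Interrupted run: 5 steps, checkpoint both w and the optimizer state.
w2, state2 = 0.0, fresh()
for _ in range(5):
    w2 = adam_step(w2, grad(w2), state2)
ckpt_w, ckpt_state = w2, copy.deepcopy(state2)

# Resume WITH the saved optimizer state: trajectory is identical.
w_resumed, state_resumed = ckpt_w, copy.deepcopy(ckpt_state)
for _ in range(5):
    w_resumed = adam_step(w_resumed, grad(w_resumed), state_resumed)

# Resume WITHOUT it (fresh t, m, v): the loss trend breaks.
w_bad, state_bad = ckpt_w, fresh()
for _ in range(5):
    w_bad = adam_step(w_bad, grad(w_bad), state_bad)

print(loss(w_resumed) == ref_loss)  # True
print(loss(w_bad) == ref_loss)      # False
```

So in addition to the model weights, the optimizer's state (in Lua Torch, the `state` table passed to `optim.adam`) has to be serialized at epoch x and passed back in when training resumes.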