I noticed that you place `with torch.no_grad():` around the dataset preparation (i.e., `input` and `target` on lines 210-211) rather than around the model's forward pass on line 213.
As discussed in "Evaluating pytorch models: with torch.no_grad vs model.eval()" and "'model.eval()' vs 'with torch.no_grad()'", running the forward pass outside a `torch.no_grad()` block builds the autograd graph and consumes extra memory. However, I am not sure whether omitting the `torch.no_grad()` block around line 213 also affects the model's outputs.
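For illustration, here is a minimal sketch (not the project's code; the model and input are made up) showing that `torch.no_grad()` changes only graph bookkeeping and memory usage, not the numerical outputs of a deterministic model in eval mode:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Linear(4, 2)
model.eval()  # disable dropout / batch-norm updates
x = torch.randn(3, 4)

# Forward pass WITH gradient tracking: builds a computation graph.
out_grad = model(x)

# Forward pass WITHOUT gradient tracking: no graph, lower memory use.
with torch.no_grad():
    out_nograd = model(x)

# Numerical outputs are identical; only autograd bookkeeping differs.
assert torch.equal(out_grad, out_nograd)
assert out_grad.grad_fn is not None
assert out_nograd.grad_fn is None
```

So leaving out `torch.no_grad()` on the evaluation forward pass should not change the outputs themselves, but it does retain the graph and increases memory consumption.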
Thanks for your valuable project.
AlwaysBeDreaming-DFCIL/learners/default.py
Lines 206 to 226 in 50d9c2e