Fix dropout backward in TEST phase #502
Conversation
@jeffdonahue I wasn't sure if, instead of copying the top_diff to the bottom_diff, that could now be replaced with …
      // exhaustive gradient check.
      checker.CheckGradient(&layer, &(this->blob_bottom_vec_),
          &(this->blob_top_vec_));
    } else {
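For context, a minimal standalone sketch of the dropout backward semantics under discussion (illustrative only, assuming standard inverted-dropout with scale = 1 / (1 - dropout_ratio); not the actual DropoutLayer source):

    #include <cstddef>
    #include <vector>

    // Sketch: in TRAIN the gradient flows only through units kept by the
    // mask saved during forward; in TEST dropout acts as the identity, so
    // the top diff is simply passed through to the bottom diff.
    void dropout_backward(const std::vector<float>& top_diff,
                          const std::vector<unsigned int>& mask,
                          float scale, bool train_phase,
                          std::vector<float>* bottom_diff) {
      for (std::size_t i = 0; i < top_diff.size(); ++i) {
        if (train_phase) {
          (*bottom_diff)[i] = top_diff[i] * mask[i] * scale;
        } else {
          (*bottom_diff)[i] = top_diff[i];
        }
      }
    }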
Should this case really be in the tests? You can just skip this test with a filter on a machine where you don't want to run it.
I just copied the code from the TRAIN mode, so I don't know who added that in the first place. Maybe we can remove it.
@Yangqing I checked the log and it seems that you added the check for major >= 2:
#502 (diff)
Should we keep it or remove it?
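For reference, a hedged sketch of the kind of guard being discussed (names and layout assumed, not the verbatim Caffe test code): the exhaustive gradient check is gated on CUDA compute capability 2.0 or higher, and is skipped on older devices.

    #include <cuda_runtime.h>

    // Assumed helper, for illustration only: query the current device and
    // report whether it supports the exhaustive gradient check.
    bool device_supports_exhaustive_check() {
      cudaDeviceProp prop;
      cudaGetDeviceProperties(&prop, /*device=*/0);
      return prop.major >= 2;
    }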
@sguada I agree we should be able to compute the gradient in test, but I'm less sure I'm comfortable with the copy and pointer checking. Please rebase for a clean merge too.
@shelhamer talking with @jeffdonahue we thought it was safer to just do the copy, and to add logic that avoids copying when source and destination are the same. I wasn't sure if it was safe to use …
I think there could be a case where during TRAIN the data is not shared, but then after running in TEST mode it becomes shared, so once it is back in TRAIN it stays shared. Although this case may never appear, since we usually use this layer in-place.
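A minimal sketch of the guarded copy described above (illustrative only, assuming a raw-pointer interface like Caffe's caffe_copy): skip the copy when the layer runs in-place, i.e. when the two diffs alias the same buffer.

    #include <algorithm>

    // Illustrative sketch, not the actual Caffe code: copy the diff only
    // when source and destination are distinct buffers, since in-place use
    // makes them alias and the copy would be a pointless self-copy.
    void copy_diff_if_needed(const float* top_diff, float* bottom_diff,
                             int count) {
      if (top_diff != bottom_diff) {
        std::copy(top_diff, top_diff + count, bottom_diff);
      }
    }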
Ok, fair enough! Rebase and merge as you like.
Conflicts:
    src/caffe/layers/dropout_layer.cpp
    src/caffe/layers/dropout_layer.cu
It is rebased and ready to merge, once @Yangqing confirms that the …
Fix dropout backward in TEST phase
This PR just adds the option to pass gradients backward during the TEST phase.