Loss generalization #686
Commits on Aug 13, 2014
- d0cae53: Add loss_weight to proto, specifying coefficients for each top blob in the objective function.
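A minimal sketch of the intended usage, in the prototxt format of the time (layer and blob names here are hypothetical): each top blob's loss_weight is its coefficient in the overall objective.

```
# Hypothetical net fragment: two loss terms with different coefficients.
layers {
  name: "class_loss"
  type: SOFTMAX_LOSS
  bottom: "fc8"
  bottom: "label"
  top: "class_loss"
  loss_weight: 1       # default weight for a loss layer
}
layers {
  name: "recon_loss"
  type: EUCLIDEAN_LOSS
  bottom: "decode"
  bottom: "data"
  top: "recon_loss"
  loss_weight: 0.01    # down-weight this term in the objective
}
```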
- 7a3ed9b: Add net tests for loss_weight. Check that the loss and gradients throughout the net are appropriately scaled for a few loss_weight values, assuming a default weight of 1 in the loss layer only. Also modify test_gradient_check_util to associate a loss of 2 rather than 1 with the top blob, so that loss layer tests fail if they don't scale their diffs.
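A rough way to see why the checker associates a loss of 2 rather than 1 with the top blob: if a top blob $t$ contributes $w \sum_i t_i$ to the objective, then

$$\frac{\partial \ell}{\partial t_i} = w,$$

so a loss layer's backward pass must seed its diff with $w$ rather than a hard-coded 1. Running the numeric check with $w = 2$ makes any layer that ignores the coefficient fail the gradient test.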
- 512a626: Generalize loss by allowing any top blob to be used as a loss, with its elements summed under a scalar coefficient. Forward for layers no longer returns a loss; instead, all loss layers must have top blobs. Existing loss layers are given a top blob automatically by Net::Init, with an associated top_loss_weight of 1 (set in LossLayer::FurtherSetUp). Because of the increased amount of common SetUp logic, the SetUp interface is modified so that subclasses normally override only FurtherSetUp, which is called by SetUp.
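In other words (a paraphrase of the commit message, not a quote from it), the network objective becomes a weighted sum over the elements of every top blob:

$$\mathcal{L} \;=\; \sum_{\text{top blobs } b} w_b \sum_i b_i,$$

where $w_b$ is the blob's loss_weight: 1 for the top of an existing loss layer, and, per the defaults described above, 0 for other tops, which then contribute nothing to the objective.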
- 6176f5b: Make multiple losses work by inserting split layers, and add some tests for it. Test that we can call backward with an ACCURACY layer. This currently fails, but should be possible now that we explicitly associate a loss weight with each top blob.
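Roughly the situation the new tests exercise (layer and blob names are hypothetical): two loss layers consume the same blob, so Net::Init inserts a split layer behind the scenes and the diffs from both consumers are accumulated correctly in the backward pass.

```
# "fc8" feeds two loss layers; a SPLIT layer is inserted automatically
# so each consumer gets its own copy and their gradients sum back into "fc8".
layers {
  name: "softmax_loss"
  type: SOFTMAX_LOSS
  bottom: "fc8"
  bottom: "label"
  top: "softmax_loss"
}
layers {
  name: "l2_loss"
  type: EUCLIDEAN_LOSS
  bottom: "fc8"
  bottom: "target"
  top: "l2_loss"
  loss_weight: 0.5   # hypothetical coefficient for the second term
}
```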
- e01c867: Net::Init can determine that layers don't need backward if they are not used to compute the loss.
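For example (a hypothetical fragment), a top blob with no loss weight does not feed the objective, so the layer producing it, and any layer whose outputs are consumed only by it, can be skipped in the backward pass:

```
layers {
  name: "accuracy"
  type: ACCURACY
  bottom: "fc8"
  bottom: "label"
  top: "accuracy"    # no loss_weight: not part of the loss, so no backward needed here
}
```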
- 7c5dc20
- f50b233: Disallow in-place computation in the SPLIT layer; it has strange effects in the backward pass when its input feeds a loss.
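For context, "in-place" here means a top that reuses its bottom's blob (blob names below are hypothetical); after this change a SPLIT layer must give every top its own blob:

```
# Disallowed after this change: the first top shares the bottom blob.
layers {
  name: "fc8_split"
  type: SPLIT
  bottom: "fc8"
  top: "fc8"          # in-place
  top: "fc8_copy"
}
# Fine: each top is a distinct blob, so diffs accumulate into "fc8" cleanly.
layers {
  name: "fc8_split"
  type: SPLIT
  bottom: "fc8"
  top: "fc8_split_0"
  top: "fc8_split_1"
}
```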
- 7d590a8
- f7b4507
- 415123b