Loss generalization #686

Merged Aug 13, 2014 · 10 commits

Commits on Aug 13, 2014

  1. Add loss_weight to proto, specifying coefficients for each top blob in
    the objective function.
    jeffdonahue committed Aug 13, 2014 · d0cae53
    (A prototxt sketch of this field follows the commit list.)
  2. Add net tests for loss_weight.

    Check that the loss and gradients throughout the net are appropriately scaled
    for a few loss_weight values, assuming a default weight of 1 in the loss layer
    only.  Also modify test_gradient_check_util to associate a loss of 2 rather
    than 1 with the top blob, so that loss layer tests fail if they don't scale
    their diffs.
    jeffdonahue committed Aug 13, 2014 · 7a3ed9b
  3. Generalize loss by allowing any top blob to be used as a loss, with its
    elements summed and scaled by a scalar coefficient.
    
    Forward for layers no longer returns a loss; instead all loss layers must have
    top blobs.  Existing loss layers are given a top blob automatically by
    Net::Init, with an associated top_loss_weight of 1 (set in
    LossLayer::FurtherSetUp).  Due to the increased amount of common SetUp logic,
    the SetUp interface is modified such that all subclasses should normally
    override FurtherSetUp only, which is called by SetUp.
    jeffdonahue committed Aug 13, 2014 · 512a626
    (A prototxt sketch of what this enables follows the commit list.)
  4. Make multiple losses work by inserting split layers, and add some tests
    for it.
    
    Test that we can call backward with an ACCURACY layer.  This currently fails,
    but should be possible now that we explicitly associate a loss weight with
    each top blob.
    jeffdonahue committed Aug 13, 2014 · 6176f5b
    (A two-loss prototxt sketch follows the commit list.)
  5. e01c867
  6. 7c5dc20
  7. Disallow in-place computation in the SPLIT layer -- it has strange
    effects in the backward pass when its output is fed into a loss.
    jeffdonahue committed Aug 13, 2014 · f50b233
  8. 7d590a8
  9. f7b4507
  10. 415123b