
Why the different types of schedulers/annealing rates for the optimizer? #49

Answered by brando90
brando90 asked this question in Q&A

Oh I just noticed there is this:

        # reduce the learning rate
        if opts.no_validation:
            scheduler.step()
        else:
            scheduler.step(loss_valid)

Does that mean it only reduces the learning rate if it does not detect a change for some number of epochs, with respect to the validation loss? (By the way, why not just feed it the validation error or accuracy instead?)
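
For reference, the scheduler.step(loss_valid) call signature matches PyTorch's torch.optim.lr_scheduler.ReduceLROnPlateau, which multiplies the learning rate by a fixed factor once the value passed to step() has stopped improving for a set number (patience) of consecutive epochs. Assuming that is the scheduler in use here, below is a minimal standalone sketch of that behaviour; the model, optimizer, and hyperparameter values are placeholders, not taken from this repository:

    import torch

    model = torch.nn.Linear(10, 1)                      # placeholder model
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    # mode="min": lower is better; after `patience` epochs without improvement,
    # the learning rate is multiplied by `factor`.
    scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
        optimizer, mode="min", factor=0.1, patience=3
    )

    for epoch in range(10):
        optimizer.zero_grad()
        loss = model(torch.randn(4, 10)).pow(2).mean()  # dummy training loss
        loss.backward()
        optimizer.step()

        loss_valid = 1.0            # constant here, i.e. it never improves
        scheduler.step(loss_valid)  # LR is cut only after `patience` non-improving epochs
        print(epoch, optimizer.param_groups[0]["lr"])

ReduceLROnPlateau only looks at the number it is given, so a validation error (or an accuracy, with mode="max") could be monitored in the same way as the validation loss.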

Answer selected by yangky11