
[MXNET-#16167] Refactor Optimizer #17400

Merged
merged 25 commits into apache:master on Feb 29, 2020

Conversation

@szhengac (Contributor) commented Jan 21, 2020

Description

Refactor the optimizers for MXNet 2.0.

Main changes:

  1. The base classes Optimizer and Updater are split into two separate files.
  2. Each optimizer has its own file.
  3. To improve readability, each optimizer has two functions: step and fused_step. The pure NDArray implementation lives in step, while fused_step calls the optimized fused kernels. Which one is used is controlled by the flag use_fused_step. The main reason for having two step functions is that it is hard for an optimization researcher to implement a new optimizer by referring to the existing fused implementations such as SGD. PyTorch, by contrast, uses only pure Python code in optim. Providing two update functions therefore preserves both readability and efficiency (see the sketch after this list).
  4. update, update_multi_precision, step, and fused_step now take lists of indices, weights, grads, and states as input, where the length of each list is determined by aggregate_num. When aggregate_num = numpy.inf, all parameters are aggregated into a single call. This change is necessary for implementing optimizers such as LBFGS or a Barzilai-Borwein step size, which need access to all parameters in a single function.
  5. Fix the weight decay inconsistency reported in #9881 (Inconsistent weight decay logics in multiple optimizers). Weight decay is now applied consistently with PyTorch and TensorFlow.
  6. Clean up the C++ implementation so that it is consistent with the Python code. A small difference in implementation was found to cause a precision difference of about 1e-3.
  7. Discard ccSGD and LBSGD. LBSGD is simply SGD plus linear scaling and gradient accumulation, which should not be a separate optimizer in the optimizer API.
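Below is a minimal, self-contained sketch (plain NumPy, not the actual MXNet code) of how points 3-5 fit together: a toy SGD with a readable step, a fused_step dispatched by the use_fused_step flag, list-based arguments, and weight decay folded into the gradient. The class and helper names here are illustrative assumptions, not the exact API introduced by this PR.

```python
# Illustrative sketch only: a toy SGD written in the new step/fused_step style.
# The (indices, weights, grads, states) list signature and the use_fused_step
# flag follow the description above; everything else is a simplification.
import numpy as np


class ToySGD:
    def __init__(self, learning_rate=0.01, momentum=0.9, wd=0.0,
                 rescale_grad=1.0, use_fused_step=False):
        self.lr = learning_rate
        self.momentum = momentum
        self.wd = wd
        self.rescale_grad = rescale_grad
        self.use_fused_step = use_fused_step

    def create_state(self, index, weight):
        # one momentum buffer per parameter
        return np.zeros_like(weight)

    def step(self, indices, weights, grads, states):
        """Pure, readable array implementation of the update."""
        for index, weight, grad, state in zip(indices, weights, grads, states):
            grad = grad * self.rescale_grad
            # weight decay folded into the gradient, as in PyTorch/TensorFlow
            grad = grad + self.wd * weight
            state *= self.momentum
            state -= self.lr * grad
            weight += state

    def fused_step(self, indices, weights, grads, states):
        """Would call an optimized fused kernel; this sketch falls back to step."""
        self.step(indices, weights, grads, states)

    def update(self, indices, weights, grads, states):
        # dispatch controlled by the use_fused_step flag
        if self.use_fused_step:
            self.fused_step(indices, weights, grads, states)
        else:
            self.step(indices, weights, grads, states)


# Usage: lists of parameters are passed together, so an optimizer that needs to
# see all parameters at once (e.g. LBFGS) can use the same signature.
w = [np.ones(3), np.ones(2)]
g = [np.full(3, 0.1), np.full(2, 0.2)]
opt = ToySGD()
s = [opt.create_state(i, wi) for i, wi in enumerate(w)]
opt.update([0, 1], w, g, s)
```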

Checklist

Essentials

Please feel free to remove inapplicable items for your PR.

  • The PR title starts with [MXNET-$JIRA_ID], where $JIRA_ID refers to the relevant JIRA issue created (except PRs with tiny changes)
  • Changes are complete (i.e. I finished coding on this PR)
  • All changes have test coverage:
  • Unit tests are added for small changes to verify correctness (e.g. adding a new operator)
  • Nightly tests are added for complicated/long-running ones (e.g. changing distributed kvstore)
  • Build tests will be added for build configuration changes (e.g. adding a new build option with NCCL)
  • Code is well-documented:
  • For user-facing API changes, API doc string has been updated.
  • For new C++ functions in header files, their functionalities and arguments are documented.
  • For new examples, README.md is added to explain what the example does, the source of the dataset, expected performance on the test set, and a reference to the original paper if applicable
  • Check the API doc at https://mxnet-ci-doc.s3-accelerate.dualstack.amazonaws.com/PR-$PR_ID/$BUILD_ID/index.html
  • To the best of my knowledge, examples are either not affected by this change, or have been fixed to be compatible with this change

Changes

  • Feature1, tests, (and when applicable, API doc)
  • Feature2, tests, (and when applicable, API doc)

Comments

  • If this change is a backward incompatible change, why must this change be made.
  • Interesting edge cases to note here

@szha @eric-haibin-lin @sxjscience @leezu

@gigasquid (Member) left a comment

The Clojure code side looks good. Thanks for making the change 💯

@eric-haibin-lin (Member) left a comment

We need to hold these changes until the 1.7.x or 2.x branch is cut.

@sxjscience sxjscience merged commit f70c7b7 into apache:master Feb 29, 2020
@ChaiBapchya mentioned this pull request Mar 2, 2020
MoisesHer pushed a commit to MoisesHer/incubator-mxnet that referenced this pull request Apr 10, 2020
* refactor optimizer

* refactor optimizer

* fix svrg test

* fix rmsprop param naming

* fix signum test

* fix pylint and perl test

* fix perl test and signsgd test

* fix

* retrigger ci

* reduce ci overheads
anirudh2290 pushed a commit to anirudh2290/mxnet that referenced this pull request May 29, 2020
* refactor optimizer

* refactor optimizer

* fix svrg test

* fix rmsprop param naming

* fix signum test

* fix pylint and perl test

* fix perl test and signsgd test

* fix

* retrigger ci

* reduce ci overheads