Support for more Dadaptation #455
Conversation
The `get_optimizer` method looks like it would be better to refactor so that some parts are shared. If that is difficult to deal with, I will update it after the merge.
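For illustration, here is a minimal sketch (not the repository's actual implementation) of how the DAdaptation branch of `get_optimizer` could be factored into a common lookup table. The class names assume the layout of the `dadaptation` package; the helper name `create_dadapt_optimizer` is hypothetical.

```python
import dadaptation

# optimizer_type string -> optimizer class; class names assume the dadaptation package
DADAPT_OPTIMIZERS = {
    "DAdaptation": dadaptation.DAdaptAdam,  # kept as an alias for DAdaptAdam
    "DAdaptAdam": dadaptation.DAdaptAdam,
    "DAdaptAdaGrad": dadaptation.DAdaptAdaGrad,
    "DAdaptAdan": dadaptation.DAdaptAdan,
    "DAdaptSGD": dadaptation.DAdaptSGD,
}

def create_dadapt_optimizer(optimizer_type, trainable_params, lr=1.0, **optimizer_kwargs):
    """Build a D-Adaptation optimizer from its optimizer_type string."""
    try:
        optimizer_class = DADAPT_OPTIMIZERS[optimizer_type]
    except KeyError:
        raise ValueError(f"Unknown DAdaptation optimizer type: {optimizer_type}")
    # D-Adaptation estimates the step size itself, so lr is usually left at 1.0
    return optimizer_class(trainable_params, lr=lr, **optimizer_kwargs)
```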
Thank you for this! I will take a look when I have time!
I found something interesting: pytorch_optimizer
Thank you for this! I've merged. Sorry for the delay.
Add more optimizer_type options for DAdaptation:
- DAdaptation (DAdaptAdam)
- DAdaptAdaGrad
- DAdaptAdan
- DAdaptSGD

I will test tomorrow.
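As a reference for testing, a hedged example of constructing one of the newly listed optimizers directly from the `dadaptation` package (the class name is an assumption based on that package; the model and loss are placeholders):

```python
import torch
import dadaptation

model = torch.nn.Linear(10, 2)

# D-Adaptation optimizers are typically run with lr=1.0; the method adapts
# the effective step size on its own during training.
optimizer = dadaptation.DAdaptAdam(model.parameters(), lr=1.0, weight_decay=0.0)

for step in range(3):
    optimizer.zero_grad()
    loss = model(torch.randn(4, 10)).pow(2).mean()  # dummy loss for illustration
    loss.backward()
    optimizer.step()
```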