
Including AdaBound in the list of Optimizers. #46809

Open
AD2605 opened this issue Oct 24, 2020 · 4 comments
Labels
enhancement Not as big of a feature, but technically not a bug. Should be easy to fix module: optimizer Related to torch.optim triaged This issue has been looked at by a team member, and triaged and prioritized into an appropriate module

Comments


AD2605 commented Oct 24, 2020

🚀 Feature

Hello all,
I would like to add AdaBound to the list of existing optimizers in the torch.optim module.
Here is the link to the paper - https://openreview.net/pdf?id=Bkg3g2R9FX

Motivation

In most of my models, I noticed that Adam started off well but never converged, or got stuck even after scheduling the learning rate; if I started with SGD instead, the loss soon turned into NaNs. Out of curiosity, I switched the optimizer from Adam to SGD in the middle of training and got better results, and upon researching further I came across the paper mentioned above. I therefore believe it would be good to have this optimizer available in PyTorch by default.
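For illustration, the core idea of AdaBound can be sketched in a few lines: it takes Adam-style moment estimates but clips the per-step learning rate between dynamic bounds that both converge to a fixed `final_lr`, so the update smoothly transitions from Adam-like behaviour early on to SGD-like behaviour later. The following is a minimal, scalar pure-Python sketch of that update rule, not the actual implementation proposed in this issue; the `final_lr` and `gamma` parameter names and bound formulas follow the linked paper, and the real optimizer would of course operate on tensors via the torch.optim API.

```python
import math

def adabound_step(param, grad, state, lr=1e-3, final_lr=0.1,
                  betas=(0.9, 0.999), gamma=1e-3, eps=1e-8):
    """One AdaBound step on a single scalar parameter (illustrative sketch).

    `state` holds the step count and the Adam first/second moment estimates:
    {"step": 0, "m": 0.0, "v": 0.0}.
    """
    beta1, beta2 = betas
    state["step"] += 1
    t = state["step"]

    # Adam-style exponential moving averages of the gradient and its square.
    state["m"] = beta1 * state["m"] + (1 - beta1) * grad
    state["v"] = beta2 * state["v"] + (1 - beta2) * grad * grad

    # Bias-corrected Adam step size.
    bias_c1 = 1 - beta1 ** t
    bias_c2 = 1 - beta2 ** t
    step_size = lr * math.sqrt(bias_c2) / bias_c1
    denom = math.sqrt(state["v"]) + eps

    # Dynamic bounds: lower rises from 0 and upper falls from infinity,
    # both converging to final_lr, so late in training every coordinate
    # effectively uses the same SGD-like learning rate.
    lower = final_lr * (1 - 1 / (gamma * t + 1))
    upper = final_lr * (1 + 1 / (gamma * t))

    # Clip the effective per-coordinate learning rate, then apply momentum.
    eta = min(max(step_size / denom, lower), upper)
    return param - eta * state["m"]

# Example: a few steps minimizing f(x) = x**2 (gradient is 2x).
state = {"step": 0, "m": 0.0, "v": 0.0}
x = 5.0
for _ in range(200):
    x = adabound_step(x, 2 * x, state)
```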

Pitch

I have already implemented the optimizer; once I get the green light from your side, I will send the pull request.

cc @vincentqb

@izdeby izdeby added enhancement Not as big of a feature, but technically not a bug. Should be easy to fix module: optimizer Related to torch.optim triaged This issue has been looked at by a team member, and triaged and prioritized into an appropriate module labels Oct 26, 2020
albanD (Collaborator) commented Oct 12, 2021

Hi, sorry for the delay.
Yes, we would be happy to add this new optimizer.

We'll review the PR as soon as possible.

AD2605 (Author) commented Oct 12, 2021

Sure!

Thanks a lot!

hyperkai commented Aug 1, 2024

Hey, when will AdaBound() be officially added to the list of optimizers? I cannot find it in the docs.

AD2605 (Author) commented Aug 1, 2024

Hi @hyperkai,

This PR went stale because I was not able to stay on top of it.
Thanks for the ping.

I can put it up again shortly.
