
[Feature] Implement & Optimize a few optimizer options #95

Merged
kozistr merged 42 commits into main from refactor/optimizers on Jan 28, 2023

Conversation

kozistr (Owner) commented on Jan 28, 2023

Problem (Why?)

Implement and optimize a few optimizer features.

Solution (What/How?)

  • Move the step parameter from state to group (to reduce computation cost and memory).
  • Load betas per group, not per parameter.
  • Refactor to in-place operations (see the sketch after this list).
  • Adan optimizer
    • Support max_grad_norm.
  • Lamb optimizer
    • Support gradient averaging.
  • Lars optimizer
    • Support the dampening and nesterov parameters.
    • Fix the logic when momentum is 0.
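A minimal sketch of the first three refactors, assuming a generic Adam-style update loop; the buffer names (exp_avg, exp_avg_sq) and the exact group keys are illustrative, not the library's actual code:

```python
import torch

# Illustrative sketch (not the actual pytorch-optimizer source) of the refactor
# described above: the step counter lives in the param group, betas are read
# once per group, and momentum buffers are updated with in-place ops.
model = torch.nn.Linear(4, 2)
opt = torch.optim.SGD(model.parameters(), lr=1e-3)  # stand-in for any torch.optim.Optimizer
opt.param_groups[0]['betas'] = (0.9, 0.999)         # hypothetical per-group betas

model(torch.randn(8, 4)).sum().backward()

for group in opt.param_groups:
    group['step'] = group.get('step', 0) + 1  # one counter per group, not per parameter
    beta1, beta2 = group['betas']             # loaded once per group, not per parameter

    for p in group['params']:
        if p.grad is None:
            continue

        state = opt.state[p]
        if len(state) == 0:
            state['exp_avg'] = torch.zeros_like(p)
            state['exp_avg_sq'] = torch.zeros_like(p)

        # in-place updates avoid allocating a new tensor on every step
        state['exp_avg'].mul_(beta1).add_(p.grad, alpha=1.0 - beta1)
        state['exp_avg_sq'].mul_(beta2).addcmul_(p.grad, p.grad, value=1.0 - beta2)
```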

Other changes (bug fixes, small refactors)

None.

Notes

Bump the version to v2.2.1.
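With v2.2.1, the new options would be passed at construction time, roughly as below. The class and argument names (Adan/max_grad_norm, Lamb/grad_averaging, LARS/dampening/nesterov) are inferred from this PR's description and may not match the released API exactly:

```python
import torch
from pytorch_optimizer import Adan, Lamb, LARS  # class names assumed from this repo

model = torch.nn.Linear(4, 2)

# Rough usage sketch of the options added in this PR; argument names are taken
# from the description above and may differ slightly from the released signatures.
adan = Adan(model.parameters(), lr=1e-3, max_grad_norm=1.0)    # clip by global grad norm
lamb = Lamb(model.parameters(), lr=1e-3, grad_averaging=True)  # gradient averaging
lars = LARS(model.parameters(), lr=1e-3, momentum=0.9,
            dampening=0.1, nesterov=True)                      # new LARS parameters
```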

kozistr added the enhancement (New feature or request) and feature (New features) labels on Jan 28, 2023
kozistr self-assigned this on Jan 28, 2023
codecov-commenter commented on Jan 28, 2023

Codecov Report

Merging #95 (27d6b99) into main (f6baa63) will increase coverage by 0.02%.
The diff coverage is 100.00%.


@@            Coverage Diff             @@
##             main      #95      +/-   ##
==========================================
+ Coverage   97.73%   97.76%   +0.02%     
==========================================
  Files          37       37              
  Lines        2520     2551      +31     
==========================================
+ Hits         2463     2494      +31     
  Misses         57       57              
Impacted Files | Coverage Δ
pytorch_optimizer/optimizer/gc.py 100.00% <ø> (ø)
pytorch_optimizer/optimizer/sgdp.py 100.00% <ø> (ø)
pytorch_optimizer/base/optimizer.py 97.56% <100.00%> (+0.12%) ⬆️
pytorch_optimizer/optimizer/adabelief.py 97.29% <100.00%> (ø)
pytorch_optimizer/optimizer/adabound.py 100.00% <100.00%> (ø)
pytorch_optimizer/optimizer/adai.py 100.00% <100.00%> (ø)
pytorch_optimizer/optimizer/adamp.py 100.00% <100.00%> (ø)
pytorch_optimizer/optimizer/adan.py 100.00% <100.00%> (ø)
pytorch_optimizer/optimizer/adapnm.py 100.00% <100.00%> (ø)
pytorch_optimizer/optimizer/diffgrad.py 100.00% <100.00%> (ø)
... and 10 more


kozistr merged commit ce56167 into main on Jan 28, 2023
kozistr deleted the refactor/optimizers branch on January 28, 2023 at 11:48
Labels: enhancement (New feature or request), feature (New features), size/L