This package is dedicated to high-order optimization methods. All the methods can be used similarly to standard PyTorch optimizers.
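Being usable "similarly to standard PyTorch optimizers" means following the familiar `zero_grad()` / `step()` loop. A minimal sketch of that interface pattern (the `ToyGD` class, the parameter layout, and the quadratic loss are invented here for illustration and are not this package's API):

```python
class ToyGD:
    """Stand-in mimicking the torch.optim interface: zero_grad() and step()."""

    def __init__(self, params, lr=0.1):
        self.params = params  # list of dicts with 'value' and 'grad' entries
        self.lr = lr

    def zero_grad(self):
        # Reset accumulated gradients, as torch.optim optimizers do.
        for p in self.params:
            p["grad"] = 0.0

    def step(self):
        # Plain gradient-descent update.
        for p in self.params:
            p["value"] -= self.lr * p["grad"]


# Training loop in the usual PyTorch shape: zero gradients, compute them, step.
p = {"value": 5.0, "grad": 0.0}
opt = ToyGD([p], lr=0.2)
for _ in range(50):
    opt.zero_grad()
    p["grad"] = 2 * p["value"]  # hand-computed gradient of loss = value**2
    opt.step()
# p["value"] has contracted toward the minimizer 0.
```

Any optimizer exposing this two-method interface can be dropped into an existing training loop in place of, say, `torch.optim.SGD`.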
Features include zeroth-order optimizers, gradient chaining, and random gradient approximation.
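Random gradient approximation, one of the listed techniques, can be sketched in the SPSA style: perturb every coordinate along a random ±1 direction and estimate the whole gradient from just two function evaluations. The function names and step sizes below are illustrative assumptions, not the package's API:

```python
import random

random.seed(0)  # make the sketch reproducible

def spsa_gradient(f, x, eps=1e-3):
    """Estimate grad f(x) from two evaluations of f along a random +/-1 direction."""
    delta = [random.choice((-1.0, 1.0)) for _ in x]
    x_plus = [xi + eps * di for xi, di in zip(x, delta)]
    x_minus = [xi - eps * di for xi, di in zip(x, delta)]
    scale = (f(x_plus) - f(x_minus)) / (2 * eps)
    # Componentwise divide by delta_i; since delta_i is +/-1 this equals scale * delta_i.
    return [scale / di for di in delta]

# Zeroth-order descent on a simple quadratic, using only function values.
f = lambda x: sum(xi * xi for xi in x)
x = [3.0, -2.0]
for _ in range(200):
    g = spsa_gradient(f, x)
    x = [xi - 0.05 * gi for xi, gi in zip(x, g)]
```

The estimate is unbiased in expectation for smooth losses, which is what lets a zeroth-order method stand in for backpropagation when gradients are unavailable.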