
Simple and extensible hypergradient for PyTorch

Installation

First, install torch and torchvision following the official PyTorch instructions. Then,

pip install hypergrad

Methods

Implicit hypergradient approximation (via approximate inverse Hessian-vector products)

These methods are implemented in hypergrad/approximate_ihvp.py
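To illustrate the idea behind these methods (this is a hedged sketch, not hypergrad's actual API: the function names `neumann_ihvp` and `matvec` are invented for this example), one common scheme approximates the inverse Hessian-vector product H^{-1} v with a truncated Neumann series, H^{-1} v = alpha * sum_{k=0}^{K} (I - alpha * H)^k v, which converges when the eigenvalues of alpha * H lie in (0, 2). Plain Python is used to keep the example self-contained; in practice the Hessian-vector products would come from torch.autograd.grad.

```python
def matvec(H, v):
    """Multiply a dense matrix (given as a list of rows) by a vector."""
    return [sum(h_ij * v_j for h_ij, v_j in zip(row, v)) for row in H]

def neumann_ihvp(hvp, v, alpha=0.4, num_terms=50):
    """Approximate H^{-1} v given only a Hessian-vector-product oracle `hvp`.

    Accumulates alpha * sum_{k=0}^{num_terms} (I - alpha * H)^k v.
    """
    term = list(v)    # (I - alpha*H)^0 v
    total = list(v)   # running sum of the series
    for _ in range(num_terms):
        hv = hvp(term)
        # next power of (I - alpha*H) applied to v
        term = [t - alpha * h for t, h in zip(term, hv)]
        total = [s + t for s, t in zip(total, term)]
    return [alpha * s for s in total]

# Toy quadratic with H = diag(2, 1): the exact answer is
# H^{-1} v = [0.5, 1.0] for v = [1, 1].
H = [[2.0, 0.0], [0.0, 1.0]]
approx = neumann_ihvp(lambda u: matvec(H, u), [1.0, 1.0])
print(approx)  # close to [0.5, 1.0]
```

The truncation length and the scaling alpha trade accuracy against cost; alpha must be small enough that the series converges for the Hessian at hand.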

Citation

To cite this repository, use:

@software{hypergrad,
    author = {Ryuichiro Hataya},
    title = {{hypergrad}},
    url = {https://github.com/moskomule/hypergrad},
    year = {2023}
}

hypergrad is developed as a part of the following research projects:

@inproceedings{hataya2023nystrom,
    author = {Ryuichiro Hataya and Makoto Yamada},
    title = {{Nystr\"om Method for Accurate and Scalable Implicit Differentiation}},
    booktitle = {AISTATS},
    year = {2023}
}