
Commit

update
moskomule committed Feb 23, 2023
1 parent f642b57 commit 67af0d2
Showing 2 changed files with 3 additions and 3 deletions.
4 changes: 2 additions & 2 deletions README.md
@@ -21,7 +21,7 @@ pip install hypergrad

* conjugate gradient
* [Neumann-series approximation](https://proceedings.mlr.press/v108/lorraine20a.html)
-* [Nyström method](to_be_updated)
+* [Nyström method](https://arxiv.org/abs/2302.09726)

Implementation of these methods can be found in `hypergrad/approximate_ihvp.py`
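
For intuition, the Neumann-series option approximates the inverse-Hessian-vector product `H^{-1} v` by the truncated series `lr * Σ_{i=0}^{K} (I - lr*H)^i v`, where `H` is the Hessian of the inner loss. Below is a minimal PyTorch sketch of that idea; it is illustrative only, not the `hypergrad` implementation, and the function name and defaults are assumptions:

```python
import torch

def neumann_ihvp(loss, params, v, lr=0.01, n_iters=10):
    # Approximate H^{-1} v via the truncated Neumann series
    #   H^{-1} v ≈ lr * sum_{i=0}^{K} (I - lr*H)^i v,
    # where H is the Hessian of `loss` w.r.t. `params`.
    grads = torch.autograd.grad(loss, params, create_graph=True)
    p = [x.clone() for x in v]  # current term (I - lr*H)^i v
    s = [x.clone() for x in v]  # partial sum of the series
    for _ in range(n_iters):
        # Hessian-vector product H p via double backprop
        hvp = torch.autograd.grad(grads, params, grad_outputs=p, retain_graph=True)
        p = [pi - lr * hi for pi, hi in zip(p, hvp)]
        s = [si + pi for si, pi in zip(s, p)]
    return [lr * si for si in s]
```

Each iteration costs one Hessian-vector product, so the approximation never materializes the Hessian itself.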

@@ -45,6 +45,6 @@ To cite this repository,
author = {Ryuichiro Hataya and Makoto Yamada},
title = {{Nystr\"om Method for Accurate and Scalable Implicit Differentiation}},
booktitle = {AISTATS},
-year = {2023}
+year = {2023},
}
```
2 changes: 1 addition & 1 deletion docs/index.md
@@ -33,7 +33,7 @@ The essential challenge of such nested problems is that $\nabla_\phi g$ needs ba
f(\theta, \phi; \mathcal{T})$.
One way to address this is to use *approximated implicit differentiation* methods.
`hypergrad` currently supports [conjugate gradient-based approximation](), [Neumann-series approximation](),
-and [Nyström method approximation]() like:
+and [Nyström method approximation](https://arxiv.org/abs/2302.09726) like:

```python
from hypergrad.approx_hypergrad import nystrom
```
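
As a rough illustration of the Nyström route (a low-rank Nyström approximation of the Hessian, inverted in closed form via the Woodbury identity, as in the paper linked above), here is a self-contained sketch; the function below is hypothetical and not `hypergrad`'s actual `nystrom` API, and it assumes a single flat parameter tensor:

```python
import torch

def nystrom_ihvp(grad, theta, v, rank=5, rho=0.1):
    # Approximate (H + rho*I)^{-1} v, where H = d(grad)/d(theta), using a
    # rank-`rank` Nyström approximation of H. `grad` must be the inner-loss
    # gradient computed with create_graph=True; `theta` is a flat 1-D tensor.
    n = theta.numel()
    idx = torch.randperm(n)[:rank]  # sampled pivot coordinates
    # C = H[:, idx]: one Hessian-vector product per sampled column
    cols = []
    for i in idx:
        e = torch.zeros_like(theta)
        e[i] = 1.0
        hvp, = torch.autograd.grad(grad, theta, grad_outputs=e, retain_graph=True)
        cols.append(hvp)
    C = torch.stack(cols, dim=1)  # (n, rank)
    W = C[idx, :]                 # H[idx, idx], (rank, rank)
    # Woodbury: (rho*I + C W^+ C^T)^{-1} v = (v - C (rho*W + C^T C)^{-1} C^T v) / rho
    inner = rho * W + C.T @ C
    return (v - C @ torch.linalg.solve(inner, C.T @ v)) / rho
```

Sampling only `rank` Hessian columns keeps the cost at a handful of HVPs plus one small `rank × rank` solve, which is what makes this approximation scalable.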
