
Commit ff25789

add a super basic test comparing our grad backward pass to that of pytorch

karpathy committed Apr 14, 2020
1 parent f779a0c commit ff25789
Showing 2 changed files with 28 additions and 1 deletion.
2 changes: 1 addition & 1 deletion README.md
@@ -24,7 +24,7 @@ print(x.grad) # prints 62.0 - i.e. the numerical value of dy / dx

### Tracing / visualization

- Have a look at the jupyter notebook `trace_graph.py` to also produce graphviz visualizations. E.g. this one is of a simple 2D neuron, arrived at by calling `draw_dot` on the code below, and it shows both the data (top number in each node) and the gradient (bottom number in each node).
+ Have a look at the jupyter notebook `trace_graph.ipynb` to also produce graphviz visualizations. E.g. this one is of a simple 2D neuron, arrived at by calling `draw_dot` on the code below, and it shows both the data (top number in each node) and the gradient (bottom number in each node).

```python
from micrograd import nn
# ...
```
27 changes: 27 additions & 0 deletions test/test_basic.py
@@ -0,0 +1,27 @@
import torch
from micrograd.engine import Value
from micrograd import nn

def test_sanity_check():

    x = Value(-4.0)
    z = 2 * x + 2 + x
    q = z.relu() + z * x
    h = (z * z).relu()
    y = h + q + q * x
    y.backward()
    xmg, ymg = x, y

    x = torch.Tensor([-4.0])
    x.requires_grad = True
    z = 2 * x + 2 + x
    q = z.relu() + z * x
    h = (z * z).relu()
    y = h + q + q * x
    y.backward()
    xpt, ypt = x, y

    # forward pass went well
    assert ymg.data == ypt.data.item()
    # backward pass went well
    assert xmg.grad == xpt.grad.item()
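As a third, library-free sanity check, the same expression can also be differentiated numerically. The sketch below re-implements the test's forward pass in plain Python (writing relu as `max(., 0)`) and compares a central finite difference against the forward value; the reference numbers -20.0 and 46.0 are derived by hand here, not taken from the commit.

```python
# Plain-Python version of the expression in test_sanity_check,
# with relu written as max(., 0). No micrograd or torch required.
def f(x):
    z = 2 * x + 2 + x
    q = max(z, 0.0) + z * x
    h = max(z * z, 0.0)
    return h + q + q * x

x = -4.0
eps = 1e-6

# Central finite difference approximating dy/dx at x = -4.
# The expression is smooth near x = -4 (z = -10 and z*z = 100 are
# both far from the relu kink at 0), so this is well-behaved.
num_grad = (f(x + eps) - f(x - eps)) / (2 * eps)

print(f(x))      # → -20.0, the forward value at x = -4
print(num_grad)  # close to the analytic gradient dy/dx = 46
```

Since the finite difference, micrograd, and PyTorch should all agree on this value, a mismatch localizes the bug to whichever backward pass disagrees with the numerical estimate.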
