micrograd.c

Port of Karpathy's micrograd in pure C. Micrograd is a tiny scalar-valued autograd engine and a neural-net library on top of it with a PyTorch-like API.

Improvements over the original:

  • Achieves 100% accuracy on 100 samples of make_moons in under 100 epochs (AAA😊).
  • Better memory allocation in gradient accumulation, making it dramatically faster than Karpathy's Python version 🎉

Demo

demo.mov

Quick Start

cd micrograd.c
make
./main
./train

Example Usage

#include <stdio.h>
#include "../micrograd.c/nn.h"
#include "../micrograd.c/engine.h"

Value* a = value_new(-4.0);
Value* b = value_new(2.0);
Value* c = value_add(a, b);
Value* d = value_add(value_mul(a, b), value_pow(b, 3));
c = value_add(c, value_add(c, value_new(1.0)));
c = value_add(c, value_add(value_add(value_new(1.0), c), value_neg(a)));
d = value_add(d, value_add(value_mul(d, value_new(2.0)), value_relu(value_add(b, a))));
d = value_add(d, value_add(value_mul(value_new(3.0), d), value_relu(value_sub(b, a))));
Value* e = value_sub(c, d);
Value* f = value_pow(e, 2);
Value* g = value_div(f, value_new(2.0));
g = value_add(g, value_div(value_new(10.0), f));

backward(g);  /* one backward pass; calling it again would accumulate gradients twice */

printf("g->data: %.6f\n", g->data);  /* ~24.7041, as in Karpathy's micrograd example */
printf("a->grad: %.6f\n", a->grad);  /* ~138.8338 */
printf("b->grad: %.6f\n", b->grad);  /* ~645.5773 */

Train a neural net

train.c contains the logic for training a simple multi-layer perceptron on 100 samples of the make_moons dataset.

cd micrograd.c
make
./train

Test

test.c runs tests for all supported operations, plus sanity checks for both the nn and engine modules.

cd micrograd.c
make
./main

License

MIT