
microtorch

A small autograd engine with backpropagation (reverse-mode autodiff) over a dynamically built directed acyclic graph (DAG) and a small neural network library on top of it following a PyTorch-like API. The DAG only operates over scalar values (unlike the n-dimensional Tensors in PyTorch), so e.g. we chop up each neuron into all of its individual tiny adds and multiplies. However, this is enough to build up entire deep neural nets doing binary classification, as the demo notebook shows. This project is the result of ongoing research and exploration aimed at understanding DNNs at a fundamental level.
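
For instance, a single tanh neuron with two inputs is just a handful of scalar operations on Value objects. Below is a minimal sketch using the Value API demonstrated under Example usage (the particular input, weight and bias numbers are illustrative):

from micrograd.engine import Value

# inputs, weights and bias, each an individual scalar Value
x1, x2 = Value(2.0), Value(0.0)
w1, w2 = Value(-3.0), Value(1.0)
b = Value(6.88)

# the neuron is chopped up into its individual multiplies and adds, then squashed with tanh
n = x1*w1 + x2*w2 + b
o = n.tanh()

o.backward()
print(f'{o.data:.4f}')   # the neuron's activation
print(f'{x1.grad:.4f}')  # do/dx1, filled in by backpropagation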

Installation

Clone or download the repository and run the code locally from the repository root.
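
For example (a minimal sketch; the clone URL is inferred from the repository's GitHub path, Parviz-S/microtorch):

# clone the repository first, e.g.
#   git clone https://github.com/Parviz-S/microtorch.git
# then start Python from the repository root and check that the engine imports:
from micrograd.engine import Value

x = Value(3.0)
y = x * 2 + 1   # builds a tiny computation graph
y.backward()
print(y.data, x.grad)  # 7.0 and 2.0 (i.e. dy/dx)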

Example usage

Below is an example showing a number of the supported operations (as of Nov 10, 2024):

from micrograd.engine import Value

a = Value(-4.0)
b = Value(2.0)
c = a + b
d = a * b + b**3
c += c + 1
c += 1 + c + (-a)
d += d * 2 + (b + a).tanh()
d += 3 * d + (b - a).tanh()
e = c - d
f = e**2
g = f / 2.0
g += 10.0 / f
print(f'{g.data:.4f}') # prints 4.6252, the outcome of this forward pass
g.backward()
print(f'{a.grad:.4f}') # prints 27.0601, i.e. the numerical value of dg/da
print(f'{b.grad:.4f}') # prints 117.3345, i.e. the numerical value of dg/db
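
Building on the Value engine, the neural network library can train a small binary classifier. The sketch below assumes the repository mirrors the upstream micrograd layout, i.e. a micrograd.nn module exposing an MLP class with a parameters() method; the module path, class name, data and hyperparameters here are assumptions, so check the repo and the demo notebook for the actual API:

from micrograd.nn import MLP  # assumed module path and class name

# 3 inputs -> two hidden layers of 4 neurons -> 1 output
model = MLP(3, [4, 4, 1])

xs = [[2.0, 3.0, -1.0], [3.0, -1.0, 0.5], [0.5, 1.0, 1.0], [1.0, 1.0, -1.0]]
ys = [1.0, -1.0, -1.0, 1.0]  # desired targets

for step in range(20):
    # forward pass: squared-error loss over the four examples
    ypred = [model(x) for x in xs]
    loss = sum((yout - ygt)**2 for ygt, yout in zip(ys, ypred))

    # backward pass: reset gradients, then backpropagate through the whole DAG
    for p in model.parameters():
        p.grad = 0.0
    loss.backward()

    # plain gradient descent on every scalar parameter
    for p in model.parameters():
        p.data -= 0.05 * p.grad

print(loss.data)  # the loss should shrink over the 20 steps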

Tracing / visualization

You can see a visualization of the expressions as you build them using graphviz; this is implemented in the notebook creating_microtorch.ipynb.
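
For reference, a visualization along these lines can be built with the graphviz Python package. This is only a minimal sketch, assuming Value records its graph with micrograd-style _prev (children) and _op (operation) attributes; the helper names trace and draw_dot are illustrative and the notebook's own implementation may differ:

from graphviz import Digraph

def trace(root):
    # walk backwards from the output Value, collecting every node and edge in the DAG
    nodes, edges = set(), set()
    def build(v):
        if v not in nodes:
            nodes.add(v)
            for child in v._prev:
                edges.add((child, v))
                build(child)
    build(root)
    return nodes, edges

def draw_dot(root):
    dot = Digraph(format='svg', graph_attr={'rankdir': 'LR'})  # draw left to right
    nodes, edges = trace(root)
    for n in nodes:
        uid = str(id(n))
        # one record-shaped node per Value, showing its data and grad
        dot.node(name=uid, label=f'data {n.data:.4f} | grad {n.grad:.4f}', shape='record')
        if n._op:
            # plus a small node for the operation that produced it
            dot.node(name=uid + n._op, label=n._op)
            dot.edge(uid + n._op, uid)
    for n1, n2 in edges:
        dot.edge(str(id(n1)), str(id(n2)) + n2._op)
    return dot

# e.g. draw_dot(g).render('graph')  # writes graph.svg for the expression built above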
