A pure Python (well, Numpy) implementation of back-propagation
I realized over the weekend that, unfortunately, I didn't know how back-propagation actually works (I just relied on JAX to do it for me).
So I wrote a pure-Numpy neural network with back-prop. Take a look.
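To give a flavor of what this looks like, here's a minimal sketch of back-prop in pure Numpy: a two-layer network on a toy regression task, with the gradients derived by hand via the chain rule. This is an illustrative example, not the actual implementation (the layer sizes, learning rate, and toy data are all made up for the demo):

```python
import numpy as np

# Toy data: 16 samples, 3 features, 1 regression target.
rng = np.random.default_rng(0)
x = rng.normal(size=(16, 3))
y = rng.normal(size=(16, 1))

# Two-layer network: x @ W1 -> ReLU -> @ W2 -> prediction.
W1 = rng.normal(size=(3, 8)) * 0.1
W2 = rng.normal(size=(8, 1)) * 0.1

lr = 0.1
losses = []
for step in range(200):
    # --- forward pass ---
    z = x @ W1
    h = np.maximum(z, 0.0)          # ReLU
    y_hat = h @ W2
    loss = np.mean((y_hat - y) ** 2)
    losses.append(loss)

    # --- backward pass: apply the chain rule layer by layer ---
    d_y_hat = 2.0 * (y_hat - y) / y.shape[0]  # dL/dy_hat (MSE gradient)
    d_W2 = h.T @ d_y_hat                      # dL/dW2
    d_h = d_y_hat @ W2.T                      # dL/dh
    d_z = d_h * (z > 0.0)                     # dL/dz (ReLU passes grad where z > 0)
    d_W1 = x.T @ d_z                          # dL/dW1

    # --- gradient descent update ---
    W1 -= lr * d_W1
    W2 -= lr * d_W2

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

The key idea is that each `d_*` variable is the gradient of the loss with respect to that intermediate value, computed by multiplying the upstream gradient by the local derivative, working backwards from the loss to the weights.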
If you have any thoughts or feedback, please shoot me an email (or reach out on Twitter).
Some useful resources if you want to understand how backprop works:
- Micrograd, Karpathy's tiny ML framework.
- The Deep Learning Book was an excellent reference for the math behind backprop.