Sharing my weekend project: a re-implementation of Andrej Karpathy’s Micrograd in Zig.
You can find the source code here: https://github.com/nurpax/zigrograd
It’s a neural network engine that can automatically compute gradients for arbitrary scalar-valued expressions. I provide an example that trains a 3-layer neural network to classify handwritten digits (MNIST).
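To give a feel for what scalar autograd means, here’s a minimal, self-contained sketch of the idea in Zig. This is *not* zigrograd’s actual API: the `Value` struct, its fields, and `backwardStep` are illustrative names I made up, and the real engine supports far more ops and builds its graph dynamically. It only shows the core trick: each node stores its local derivative rule, and backprop walks nodes in reverse creation order.

```zig
const std = @import("std");

// One scalar node in the expression graph. Two ops (+, *) are enough
// to show the idea; a real engine has many more.
const Value = struct {
    data: f32,
    grad: f32 = 0,
    op: enum { leaf, add, mul } = .leaf,
    parents: [2]?*Value = .{ null, null },

    // Apply this node's local chain rule, pushing its grad to its parents.
    fn backwardStep(self: *Value) void {
        switch (self.op) {
            .leaf => {},
            .add => {
                self.parents[0].?.grad += self.grad;
                self.parents[1].?.grad += self.grad;
            },
            .mul => {
                self.parents[0].?.grad += self.parents[1].?.data * self.grad;
                self.parents[1].?.grad += self.parents[0].?.data * self.grad;
            },
        }
    }
};

pub fn main() void {
    // Build d = a * b + c by hand.
    var a = Value{ .data = 2.0 };
    var b = Value{ .data = -3.0 };
    var c = Value{ .data = 10.0 };
    var ab = Value{ .data = a.data * b.data, .op = .mul, .parents = .{ &a, &b } };
    var d = Value{ .data = ab.data + c.data, .op = .add, .parents = .{ &ab, &c } };

    // Backprop: seed the output gradient, then visit nodes in reverse
    // creation order (a valid topological order, since parents are
    // always created before their children).
    d.grad = 1.0;
    d.backwardStep();
    ab.backwardStep();

    // Expect: dd/da = b = -3, dd/db = a = 2, dd/dc = 1
    std.debug.print("grads: a={d} b={d} c={d}\n", .{ a.grad, b.grad, c.grad });
}
```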
It’s not intended for real use; it was mainly a learning exercise for me. I think it turned out quite nicely in Zig. I especially like how well the “arena allocator” pattern works for a machine learning training loop: use two separate arenas, an “init” arena that allocates model parameters and other long-lived state, and a “forward” arena used within the training loop for the forward pass and backpropagation. The forward arena is reset after each trained minibatch, which frees the whole per-batch expression graph in one cheap operation. A sketch of the pattern is below.
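Here’s a minimal sketch of that two-arena setup using `std.heap.ArenaAllocator` from the standard library. This is an assumption-laden outline, not zigrograd’s actual code: the parameter buffer, batch count, and loop body are placeholders. On a recent Zig, `ArenaAllocator.reset(.retain_capacity)` drops all allocations while keeping the backing memory, so steady-state training does almost no allocator work.

```zig
const std = @import("std");

pub fn main() !void {
    // "init" arena: model parameters and other state that lives for
    // the whole run.
    var init_arena = std.heap.ArenaAllocator.init(std.heap.page_allocator);
    defer init_arena.deinit();

    // "forward" arena: scratch memory for one minibatch's expression
    // graph (forward pass + backprop).
    var fwd_arena = std.heap.ArenaAllocator.init(std.heap.page_allocator);
    defer fwd_arena.deinit();

    // Hypothetical parameter buffer, allocated once from the init arena.
    const params = try init_arena.allocator().alloc(f32, 1024);
    _ = params;

    var batch: usize = 0;
    while (batch < 100) : (batch += 1) {
        const scratch = fwd_arena.allocator();
        _ = scratch; // ... allocate graph nodes, run forward + backward ...

        // Free every per-batch allocation in one shot; keep the backing
        // capacity so the next batch doesn't go back to the OS.
        _ = fwd_arena.reset(.retain_capacity);
    }
}
```

The nice property is that individual graph nodes never need to be freed (or even tracked); the arena reset bounds memory use to the largest single minibatch.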
A good place to start reading the code is the training loop: https://github.com/nurpax/zigrograd/blob/main/src/main.zig#L89
I hope you find it interesting.