This repository contains re-implementations of fundamental deep learning libraries, inspired by Andrej Karpathy's popular Makemore and Micrograd projects. These projects help users understand the inner workings of deep learning models, gradient-based optimization, and how to build neural networks from scratch.
Makemore is a character-level language model that generates new text in the style of its training data. This project includes:
- Implementation of a simple feedforward neural network for text generation.
- Backpropagation and gradient calculation from scratch.
- A walkthrough of training a neural network on character-level data (see the sketch after this list).
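To make the pieces above concrete, here is a minimal sketch of a character-level bigram model trained with a manually derived gradient. The toy word list, single-layer shape, and learning rate are illustrative assumptions, not the data or settings used in this repository:

```python
# A minimal character-level bigram neural model in the spirit of Makemore.
# The training words and hyperparameters here are assumptions for illustration.
import numpy as np

words = ["emma", "olivia", "ava"]           # toy training set (assumption)
chars = sorted(set("".join(words)))
stoi = {c: i + 1 for i, c in enumerate(chars)}
stoi["."] = 0                               # '.' marks the start/end of a word
V = len(stoi)                               # vocabulary size

# Build (input char, next char) training pairs.
xs, ys = [], []
for w in words:
    seq = ["."] + list(w) + ["."]
    for a, b in zip(seq, seq[1:]):
        xs.append(stoi[a]); ys.append(stoi[b])
xs, ys = np.array(xs), np.array(ys)

rng = np.random.default_rng(0)
W = rng.normal(size=(V, V))                 # one linear layer: logits = one_hot(x) @ W

for step in range(200):
    # Forward pass: logits -> softmax -> average cross-entropy loss.
    logits = W[xs]                          # row lookup, same as one_hot(xs) @ W
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    loss = -np.log(probs[np.arange(len(ys)), ys]).mean()

    # Backward pass: d(loss)/d(logits) for softmax + cross-entropy
    # is (probs - one_hot(y)) / N, derived by hand rather than by a framework.
    dlogits = probs.copy()
    dlogits[np.arange(len(ys)), ys] -= 1
    dlogits /= len(ys)
    dW = np.zeros_like(W)
    np.add.at(dW, xs, dlogits)              # scatter-add gradients back to W's rows

    W -= 10.0 * dW                          # plain gradient descent step
    if step % 50 == 0:
        print(f"step {step}: loss {loss:.3f}")
```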
Micrograd is a minimalistic deep learning library that implements the core idea of automatic differentiation (autodiff) for small neural networks. The project includes:
- A basic tensor-like value class that tracks gradients.
- Forward and backward passes through the network using autodiff.
- Training a neural network from scratch, without relying on deep learning libraries like PyTorch or TensorFlow (see the sketch after this list).
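For a feel of how such an autodiff engine fits together, here is a minimal sketch of a scalar value class with a backward pass, in the spirit of Micrograd. The class name `Value`, the operator set, and the example expression are assumptions for illustration and may differ from the code in this repository:

```python
# A minimal scalar autodiff engine in the style of Micrograd.
# Names and structure here are illustrative assumptions, not this repo's code.
class Value:
    """A scalar that stores its data, its gradient, and how it was produced."""
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None       # set by the op that creates this node
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad           # d(a + b)/da = 1
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad   # d(a * b)/da = b
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

# Usage: gradients of a tiny expression, checkable by hand.
a, b = Value(2.0), Value(3.0)
c = a * b + a                 # dc/da = b + 1 = 4, dc/db = a = 2
c.backward()
print(a.grad, b.grad)         # 4.0 2.0
```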
Special thanks to Andrej Karpathy for his inspirational deep-learning content, which served as the foundation for the Makemore and Micrograd projects. Links to the original YouTube videos are given below.