RISHIT7/Fundamentals-of-DL-Liibrary

Folders and files

NameName
Last commit message
Last commit date

Latest commit

 

History

3 Commits
 
 
 
 
 
 
 
 
 
 

Repository files navigation

Fundamentals of DL Library

This repository contains from-scratch re-implementations of the fundamentals of a deep learning library, inspired by Andrej Karpathy's popular Makemore and Micrograd projects. Working through these projects helps users understand the inner workings of deep learning models, gradient-based optimization, and how to build neural networks from scratch.

Contents

1. Makemore

Makemore is a character-level language model that generates new text, one character at a time, from patterns learned in its training data. This project includes:

  • Implementation of a simple feedforward neural network for text generation.
  • Backpropagation and gradient calculation from scratch.
  • Understanding how to train a neural network on character-level data.
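The steps above can be sketched as a minimal character-level bigram model trained with hand-derived gradients. Everything here is an illustrative assumption (the tiny corpus, single linear layer, learning rate, and step count are not the repository's actual code); it only shows the shape of the technique:

```python
import numpy as np

rng = np.random.default_rng(42)
words = ["emma", "olivia", "ava", "isabella", "sophia"]  # toy corpus (assumption)

# character vocabulary, with '.' as a start/end-of-word token
chars = sorted(set("".join(words)))
stoi = {c: i + 1 for i, c in enumerate(chars)}
stoi['.'] = 0
itos = {i: c for c, i in stoi.items()}
V = len(stoi)

# training pairs: each character predicts the next one
xs, ys = [], []
for w in words:
    seq = ['.'] + list(w) + ['.']
    for c1, c2 in zip(seq, seq[1:]):
        xs.append(stoi[c1]); ys.append(stoi[c2])
xs, ys = np.array(xs), np.array(ys)
N = len(xs)

# single linear layer: logits = one_hot(x) @ W
W = rng.normal(size=(V, V)) * 0.01

for step in range(200):
    # forward: softmax over logits, average negative log-likelihood
    logits = W[xs]                        # one-hot matmul = row lookup
    logits -= logits.max(axis=1, keepdims=True)
    probs = np.exp(logits)
    probs /= probs.sum(axis=1, keepdims=True)
    loss = -np.log(probs[np.arange(N), ys]).mean()

    # backward, by hand: dL/dlogits = (probs - one_hot(y)) / N
    dlogits = probs.copy()
    dlogits[np.arange(N), ys] -= 1
    dlogits /= N
    dW = np.zeros_like(W)
    np.add.at(dW, xs, dlogits)            # scatter-add rows back into W
    W -= 10.0 * dW                        # plain SGD step

# sample a new "name" one character at a time
ix, out = 0, []
while True:
    p = np.exp(W[ix] - W[ix].max()); p /= p.sum()
    ix = rng.choice(V, p=p)
    if ix == 0:
        break
    out.append(itos[ix])
print("".join(out))
```

The manual gradient `probs - one_hot(y)` is the classic softmax/cross-entropy derivative; a framework like PyTorch would produce the same `dW` via autograd.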

2. Micrograd

Micrograd is a minimalistic deep learning library that implements the core idea of automatic differentiation (autodiff) for small neural networks. The project includes:

  • Basic tensor-like class with support for gradients.
  • Forward and backward passes through the network using autodiff.
  • Training a neural network from scratch without relying on deep learning libraries like PyTorch or TensorFlow.
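The core autodiff idea above can be sketched with a minimal scalar `Value` class. This is a simplified illustration in the spirit of Micrograd, not the repository's actual implementation; the operator set and example expression are assumptions:

```python
import math

class Value:
    """A scalar that records the computation graph so gradients can flow back."""
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(a+b)/da = d(a+b)/db = 1, so the upstream grad passes through
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # product rule: each input's grad is the other input times upstream grad
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def tanh(self):
        t = math.tanh(self.data)
        out = Value(t, (self,))
        def _backward():
            self.grad += (1 - t ** 2) * out.grad  # d tanh(x)/dx = 1 - tanh(x)^2
        out._backward = _backward
        return out

    def backward(self):
        # topologically sort the graph, then apply the chain rule in reverse
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

# forward pass: y = tanh(a*b + c), then backprop through the whole graph
a, b, c = Value(2.0), Value(-3.0), Value(10.0)
y = (a * b + c).tanh()
y.backward()
print(a.grad)  # dy/da = b * (1 - tanh(a*b + c)^2)
```

Each operation closes over its inputs and stores a tiny `_backward` function; `backward()` simply replays those closures in reverse topological order, which is exactly how reverse-mode autodiff works in larger libraries.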

Acknowledgements

Special thanks to Andrej Karpathy for his inspirational deep-learning content, which served as the foundation for these Makemore and Micrograd re-implementations. The original YouTube series is linked below.

Neural Networks: Zero to Hero
