HanCreation/20thSummer

The 20th Summer Project

This project is a collection of mini-projects and learning notes created by Han in Summer 2024, marking the 20th summer of my life.

2024Poster

The poster can be found on Instagram: @Han.Creation

This project is about learning to build micrograd from scratch. Micrograd is essentially a mini automatic-gradient engine that implements backpropagation. The engine is then used to build and train a neural network (an MLP).

The micrograd library was originally created by Andrej Karpathy.
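The core idea behind micrograd can be sketched in a few dozen lines: a scalar `Value` that records the operations applied to it, then walks the resulting graph in reverse to apply the chain rule. This is a minimal illustration of the technique, not the code in this repo:

```python
import math

class Value:
    """A scalar that remembers how it was computed, so gradients can flow back."""
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad        # d(a+b)/da = 1
            other.grad += out.grad       # d(a+b)/db = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad   # d(a*b)/da = b
            other.grad += self.data * out.grad   # d(a*b)/db = a
        out._backward = _backward
        return out

    def tanh(self):
        t = math.tanh(self.data)
        out = Value(t, (self,))
        def _backward():
            self.grad += (1 - t ** 2) * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

# d(a*b + c)/da = b, so a.grad should come out as -3.0
a, b, c = Value(2.0), Value(-3.0), Value(10.0)
out = a * b + c
out.backward()
print(a.grad)  # → -3.0
```

The same `Value` objects can then be composed into neurons and layers, which is exactly what an MLP built on top of this engine does.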

Learn to build a simple neural network from scratch using only NumPy and math, without relying on high-level libraries such as TensorFlow, PyTorch, or Keras. The task is to classify digits from the MNIST dataset, which contains images of handwritten digits. A second variant of the same network, built with PyTorch's basic tensor library, is also included.
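The heart of such a from-scratch network is writing both the forward pass and the hand-derived gradients yourself. The sketch below shows the pattern on random stand-in data (real MNIST loading is omitted); hyperparameters are illustrative, not the ones used in this repo:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for an MNIST batch: 784-dim inputs, 10 classes.
X = rng.standard_normal((64, 784))
y = rng.integers(0, 10, size=64)

# Two-layer MLP parameters.
W1 = rng.standard_normal((784, 128)) * 0.1
b1 = np.zeros(128)
W2 = rng.standard_normal((128, 10)) * 0.1
b2 = np.zeros(10)

lr = 0.1
for step in range(200):
    # Forward pass: linear -> ReLU -> linear -> softmax cross-entropy.
    h = np.maximum(0, X @ W1 + b1)
    logits = h @ W2 + b2
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    loss = -np.log(probs[np.arange(len(y)), y]).mean()

    # Backward pass: gradients derived by hand, no autograd.
    dlogits = probs.copy()
    dlogits[np.arange(len(y)), y] -= 1
    dlogits /= len(y)
    dW2 = h.T @ dlogits
    db2 = dlogits.sum(axis=0)
    dh = dlogits @ W2.T
    dh[h <= 0] = 0                               # ReLU gate
    dW1 = X.T @ dh
    db1 = dh.sum(axis=0)

    # Plain SGD update.
    for p, g in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        p -= lr * g

print(round(loss, 3))
```

The PyTorch variant replaces the hand-written backward pass with `loss.backward()` while keeping the same forward structure.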

Rebuilding Makemore, a project on building character-level language models that generate text one character at a time. This reimplementation explores several makemore architectures: a bigram language model (both simple count-based and neural-network-based), MLP models with and without batch normalization, a WaveNet-like structure, and a simple decoder-only transformer based on the "Attention Is All You Need" paper; essentially a GPT.
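The simplest of these models, the count-based bigram, fits in a short script: count character-pair transitions, normalize each row into a probability distribution, and sample one character at a time. A minimal sketch on a tiny stand-in corpus (makemore itself trains on a file of ~32k names):

```python
import numpy as np

# Tiny stand-in corpus; the real project uses a large names dataset.
words = ["emma", "olivia", "ava", "isabella", "mia"]

chars = sorted(set("".join(words)))
stoi = {c: i + 1 for i, c in enumerate(chars)}
stoi["."] = 0  # '.' marks both the start and the end of a word
itos = {i: c for c, i in stoi.items()}
V = len(stoi)

# Count bigram transitions; start at 1 for add-one smoothing.
N = np.ones((V, V))
for w in words:
    seq = ["."] + list(w) + ["."]
    for c1, c2 in zip(seq, seq[1:]):
        N[stoi[c1], stoi[c2]] += 1

# Normalize each row into a next-character distribution.
P = N / N.sum(axis=1, keepdims=True)

# Sample one character at a time until the end token reappears.
rng = np.random.default_rng(42)
ix, out = 0, []
while True:
    ix = rng.choice(V, p=P[ix])
    if ix == 0:
        break
    out.append(itos[ix])
print("".join(out))
```

The neural-network versions replace the count table with learned parameters, and the MLP, WaveNet-like, and transformer variants progressively widen the context beyond a single previous character.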

NeuroCraft is a web application that lets users craft their own neural networks without writing any code. The project includes a frontend built with React and a backend built with Python and PyTorch, with Flask serving as the API. The app is suited for deep-learning beginners who want to explore the fundamentals of neural networks, such as linear/dense layers, batch normalization, activation functions, dropout layers, and training/test accuracy. The datasets provided are two basic ones: Digit MNIST and Fashion-MNIST.
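A no-code builder like this typically boils down to translating a layer description from the frontend into real PyTorch modules on the backend. The spec format and `build_model` helper below are hypothetical, shown only to illustrate the pattern; they are not NeuroCraft's actual API:

```python
import torch
import torch.nn as nn

# Hypothetical JSON-style spec a no-code frontend might send to the backend.
spec = [
    {"type": "linear", "in": 784, "out": 128},
    {"type": "batchnorm", "num": 128},
    {"type": "relu"},
    {"type": "dropout", "p": 0.2},
    {"type": "linear", "in": 128, "out": 10},
]

def build_model(spec):
    """Translate each layer description into its torch.nn module."""
    layers = []
    for s in spec:
        if s["type"] == "linear":
            layers.append(nn.Linear(s["in"], s["out"]))
        elif s["type"] == "batchnorm":
            layers.append(nn.BatchNorm1d(s["num"]))
        elif s["type"] == "relu":
            layers.append(nn.ReLU())
        elif s["type"] == "dropout":
            layers.append(nn.Dropout(s["p"]))
        else:
            raise ValueError(f"unknown layer type: {s['type']}")
    return nn.Sequential(*layers)

model = build_model(spec)
out = model(torch.randn(32, 784))  # a fake batch of 32 MNIST-sized inputs
print(out.shape)  # torch.Size([32, 10])
```

A Flask endpoint can then accept such a spec as JSON, build the model, train it on the chosen dataset, and report train/test accuracy back to the React frontend.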


Edited, Noted, Coded;

Anything Created by Han 2024
