harsh21122/NeuralNetwork

A simple but detailed personal implementation of a neural network from scratch.

The implementation covers the following configurations (a from-scratch sketch of a few of them follows this list):

  • Activation functions: ReLU, sigmoid, tanh, and softmax.
  • Optimization algorithms: gradient descent, gradient descent with momentum, NAG (Nesterov accelerated gradient), AdaGrad, RMSProp, and Adam.
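
In a from-scratch setting, each of these is typically a short NumPy routine. The sketch below shows one plausible form for the ReLU and softmax activations and a single Adam update step; the function names, signatures, and hyperparameter defaults are illustrative assumptions and may not match the ones used in this repository.

```python
import numpy as np

def relu(z):
    # Element-wise rectified linear unit: max(0, z).
    return np.maximum(0, z)

def softmax(z):
    # Numerically stable softmax over the last axis.
    shifted = z - np.max(z, axis=-1, keepdims=True)
    exp_z = np.exp(shifted)
    return exp_z / np.sum(exp_z, axis=-1, keepdims=True)

def adam_update(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    # One Adam step (t is the 1-based timestep): update the biased first and
    # second moment estimates, correct their bias, then apply the scaled step.
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v
```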

The notebook test.ipynb shows results on the Fashion-MNIST dataset for different configurations; please refer to it for a full walkthrough.
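
For context, a minimal Fashion-MNIST preprocessing sketch is shown below. Keras is used here only as a convenient download helper, and the commented-out training call is a hypothetical placeholder rather than this repository's actual API.

```python
import numpy as np
from tensorflow.keras.datasets import fashion_mnist

# Load Fashion-MNIST: 60,000 training and 10,000 test images of 28x28 pixels.
(x_train, y_train), (x_test, y_test) = fashion_mnist.load_data()

# Flatten each image to a 784-dimensional vector and scale pixels to [0, 1].
x_train = x_train.reshape(-1, 784).astype(np.float32) / 255.0
x_test = x_test.reshape(-1, 784).astype(np.float32) / 255.0

# One-hot encode the 10 class labels for use with a softmax output layer.
y_train_onehot = np.eye(10)[y_train]
y_test_onehot = np.eye(10)[y_test]

# Hypothetical training call -- the actual class and method names are defined
# in this repository's source files and the notebook:
# net = NeuralNetwork(layers=[784, 128, 10], activation="relu", optimizer="adam")
# net.fit(x_train, y_train_onehot, epochs=10)
```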
