
Implementation of ConvNet in Numpy

This project started out as an implementation of a CNN layer and grew into a complete end-to-end deep learning implementation. There is still plenty of room for improvements and corrections, so do open an issue if you find anything; contributions would be great. If you want to learn or understand these kinds of implementations end to end, you can use this repo as a starting point.

Since NumPy is built around single-core operations, a full CNN example run file turned out to be too slow to include. If you know how to parallelize NumPy operations, let me know!
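For context, a convolution written directly in NumPy ends up looping over every output position in Python, which is the main reason a full CNN run is slow without further vectorization or parallelism. The sketch below is illustrative only; the function name `conv2d_forward` and its argument layout are assumptions, not this repo's API.

```python
import numpy as np

def conv2d_forward(x, w, b, stride=1):
    """Direct 2D convolution. x: (H, W, C_in), w: (kH, kW, C_in, C_out), b: (C_out,)."""
    kH, kW, _, C_out = w.shape
    H, W, _ = x.shape
    out_h = (H - kH) // stride + 1
    out_w = (W - kW) // stride + 1
    out = np.zeros((out_h, out_w, C_out))
    for i in range(out_h):          # every output row ...
        for j in range(out_w):      # ... and column is computed one patch at a time
            patch = x[i*stride:i*stride+kH, j*stride:j*stride+kW, :]
            out[i, j, :] = np.tensordot(patch, w, axes=([0, 1, 2], [0, 1, 2])) + b
    return out
```

Even with `np.tensordot` handling the inner reduction, the Python loops over output positions dominate the runtime for realistic image sizes.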

Requirements

  • numpy 1.18.x
  • matplotlib 3.2.x
  • scikit-learn 0.23.x

Implementation

I've tried to implement all of the following from scratch, of course with help from freely available articles and codebases (a minimal sketch of a few of these pieces follows the list).

  • Convolution Layer
  • Dense Layer
  • Pooling Layer
  • Flatten Layer
  • Forward Function
  • Backpropagation Function
  • Sigmoid Activation Function
  • ReLU Activation Function
  • Binary Cross Entropy Loss Function
  • Adam Optimizer
  • Weights Initialization
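To give a rough idea of what these components look like in plain NumPy, here is a minimal, self-contained sketch covering a dense layer, ReLU/sigmoid activations, binary cross entropy, and one Adam update step. The names (`Dense`, `adam_step`, etc.) and the exact interfaces are assumptions for illustration, not the classes or functions defined in this repo.

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def bce_loss(y_pred, y_true, eps=1e-8):
    # clip to avoid log(0)
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1.0 - y_true) * np.log(1.0 - y_pred))

class Dense:
    def __init__(self, in_dim, out_dim):
        # He-style weight initialization, zero bias
        self.W = np.random.randn(in_dim, out_dim) * np.sqrt(2.0 / in_dim)
        self.b = np.zeros(out_dim)

    def forward(self, x):
        self.x = x                      # cache the input for the backward pass
        return x @ self.W + self.b

    def backward(self, grad_out):
        # gradients w.r.t. parameters and w.r.t. the layer input
        self.dW = self.x.T @ grad_out
        self.db = grad_out.sum(axis=0)
        return grad_out @ self.W.T

def adam_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    # standard Adam update with bias correction (t starts at 1)
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v
```

Used together, one training step would roughly be: run `forward` through the layers, compute `bce_loss` on the sigmoid output, call `backward` in reverse order, then apply `adam_step` to each parameter with its gradient and moment buffers.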

There is a lot of scope to add more to this repo, for example a Global Average Pooling layer, categorical cross entropy loss, or other activation functions. Pull requests for corrections, updates, and new implementations are very welcome.

Example

$ python fcn_model.py

