
A self-designed and self-trained simple neural network to classify MNIST handwritten digits.


avinoam134/Self-Made-Neural-Network

This project is a practical demonstration of my theoretical knowledge of the foundations of neural networks and how they work behind the scenes.

  • Every layer, loss and activation function, optimization algorithm, gradient computation, and the model as a whole was implemented without using any library implementations. All of the math was derived by hand and executed with NumPy arrays for convenience only!

  • All implementations are organized in the code under self-explanatory names, and the "Tests.py" script contains many functions that demonstrate the phases completed in the project, as well as tests (such as implementing and running gradient and Jacobian checks) that verify the math and computations are correct. A minimal sketch of such a check appears after this list.

  • You may also find several unused functions, such as SGD variations, that offer a sneak peek at my experimentation while writing the project; one such variation is sketched below.

  • The project revolves around building a classification model for a synthetic dataset with various input and output dimensions, using a cross-entropy loss function and supporting both classic fully connected (FC) layers and residual FC layers in a model (see the loss and layer sketches after this list).

  • Enjoy reviewing and playing with the code yourself!
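
As a hedged illustration of the gradient checks mentioned above (not the repository's actual code; the function names here are hypothetical), a central-difference check against an analytic gradient can look like this:

```python
import numpy as np

def numerical_gradient(f, x, eps=1e-6):
    """Central-difference approximation of df/dx for a scalar-valued f."""
    grad = np.zeros_like(x)
    for idx in np.ndindex(x.shape):
        orig = x[idx]
        x[idx] = orig + eps
        f_plus = f(x)
        x[idx] = orig - eps
        f_minus = f(x)
        x[idx] = orig  # restore the original value
        grad[idx] = (f_plus - f_minus) / (2 * eps)
    return grad

def gradient_check(f, grad_f, x, tol=1e-5):
    """Return True if the analytic gradient matches the numerical one."""
    num = numerical_gradient(f, x.copy())
    ana = grad_f(x)
    rel_err = np.linalg.norm(num - ana) / (np.linalg.norm(num) + np.linalg.norm(ana) + 1e-12)
    return rel_err < tol

# Sanity check on f(x) = sum(x**2), whose gradient is 2x.
x = np.random.randn(3, 4)
print(gradient_check(lambda v: np.sum(v ** 2), lambda v: 2 * v, x))  # True
```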
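
As one example of the kind of SGD variation mentioned above, here is a minimal, hypothetical momentum update written against plain NumPy arrays; it is a sketch, not the repository's actual optimizer API.

```python
import numpy as np

class MomentumSGD:
    """SGD with momentum: v <- mu * v - lr * grad;  w <- w + v."""

    def __init__(self, lr=0.01, momentum=0.9):
        self.lr = lr
        self.momentum = momentum
        self.velocity = {}  # one velocity buffer per parameter name

    def step(self, params, grads):
        # params and grads are dicts of NumPy arrays keyed by parameter name
        for name, w in params.items():
            v = self.velocity.get(name, np.zeros_like(w))
            v = self.momentum * v - self.lr * grads[name]
            self.velocity[name] = v
            params[name] = w + v
```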
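
Finally, a rough sketch of the pieces the project is built around: a softmax cross-entropy loss and the forward passes of a classic FC layer and a residual FC block. Again, this is illustrative NumPy code with assumed names, not the repository's actual functions.

```python
import numpy as np

def softmax_cross_entropy(logits, labels):
    """Softmax + cross-entropy loss and its gradient with respect to the logits.
    logits: (batch, classes); labels: (batch,) integer class indices."""
    shifted = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    probs = np.exp(shifted)
    probs /= probs.sum(axis=1, keepdims=True)
    n = logits.shape[0]
    loss = -np.log(probs[np.arange(n), labels] + 1e-12).mean()
    grad = probs
    grad[np.arange(n), labels] -= 1.0
    return loss, grad / n

def fc_forward(x, W, b):
    """Classic fully connected layer: y = x @ W + b."""
    return x @ W + b

def residual_fc_forward(x, W1, b1, W2, b2):
    """Residual FC block: y = x + relu(x @ W1 + b1) @ W2 + b2 (input and output dims match)."""
    h = np.maximum(0.0, x @ W1 + b1)  # ReLU activation on the inner layer
    return x + h @ W2 + b2
```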
