Optimizer-Algorithms-for-Neural-Networks-from-scratch

Implemented optimizer algorithms such as SGD, Momentum, RMSprop, and Adam from scratch to minimize the error function during neural network training; the adaptive methods converge much faster toward a minimum of the error function.
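As a minimal sketch of the four update rules (the actual function names, signatures, and hyperparameters in this repo may differ):

```python
import numpy as np

def sgd(w, grad, lr=0.01):
    """Vanilla SGD: step directly along the negative gradient."""
    return w - lr * grad

def momentum(w, grad, v, lr=0.01, beta=0.9):
    """Momentum: accumulate an exponentially decaying velocity of past gradients."""
    v = beta * v + lr * grad
    return w - v, v

def rmsprop(w, grad, s, lr=0.001, beta=0.9, eps=1e-8):
    """RMSprop: scale the step by a running average of squared gradients."""
    s = beta * s + (1 - beta) * grad ** 2
    return w - lr * grad / (np.sqrt(s) + eps), s

def adam(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """Adam: momentum plus RMSprop-style scaling, with bias correction (t starts at 1)."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)   # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)   # bias-corrected second moment
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v
```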

We built an ANN from scratch in Python and NumPy, trained it on the MNIST dataset of handwritten digits, and compared the algorithms with respect to their speed of convergence.
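A hypothetical comparison harness (not the repo's actual training script) would track the loss per step for each optimizer; here a toy least-squares problem stands in for the MNIST network, reusing the update-rule functions from the sketch above:

```python
import numpy as np

# Toy regression problem standing in for the MNIST network.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
true_w = rng.normal(size=10)
y = X @ true_w

def loss_and_grad(w):
    """Mean squared error and its gradient for the linear model X @ w."""
    err = X @ w - y
    return 0.5 * np.mean(err ** 2), X.T @ err / len(y)

histories = {}
for name in ["sgd", "momentum", "rmsprop", "adam"]:
    w = np.zeros(10)
    v, s, m = np.zeros(10), np.zeros(10), np.zeros(10)  # optimizer state
    losses = []
    for t in range(1, 201):
        loss, g = loss_and_grad(w)
        losses.append(loss)
        if name == "sgd":
            w = sgd(w, g, lr=0.05)
        elif name == "momentum":
            w, v = momentum(w, g, v, lr=0.05)
        elif name == "rmsprop":
            w, s = rmsprop(w, g, s, lr=0.01)
        else:
            w, m, v = adam(w, g, m, v, t, lr=0.05)
    histories[name] = losses  # plot these curves to compare convergence speed
```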
