Implemented optimization algorithms such as SGD, Momentum, RMSprop, and Adam to minimize the loss function during neural network training; the adaptive methods typically yield much faster convergence toward a minimum of the loss than plain gradient descent.
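A minimal sketch of the four update rules in NumPy might look like the following. The class names and hyperparameter defaults are illustrative assumptions, not the project's actual code:

```python
import numpy as np

# Illustrative update rules; names and defaults are assumptions,
# not the project's actual implementation.

class SGD:
    def __init__(self, lr=0.01):
        self.lr = lr

    def update(self, w, grad):
        # Plain gradient descent step.
        return w - self.lr * grad

class Momentum:
    def __init__(self, lr=0.01, beta=0.9):
        self.lr, self.beta = lr, beta
        self.v = 0.0

    def update(self, w, grad):
        # Accumulate an exponentially decaying average of past gradients.
        self.v = self.beta * self.v + self.lr * grad
        return w - self.v

class RMSprop:
    def __init__(self, lr=0.001, beta=0.9, eps=1e-8):
        self.lr, self.beta, self.eps = lr, beta, eps
        self.s = 0.0

    def update(self, w, grad):
        # Scale the step by a running average of squared gradients.
        self.s = self.beta * self.s + (1 - self.beta) * grad ** 2
        return w - self.lr * grad / (np.sqrt(self.s) + self.eps)

class Adam:
    def __init__(self, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
        self.lr, self.beta1, self.beta2, self.eps = lr, beta1, beta2, eps
        self.m, self.s, self.t = 0.0, 0.0, 0

    def update(self, w, grad):
        # Combine momentum with RMSprop-style scaling, plus bias correction.
        self.t += 1
        self.m = self.beta1 * self.m + (1 - self.beta1) * grad
        self.s = self.beta2 * self.s + (1 - self.beta2) * grad ** 2
        m_hat = self.m / (1 - self.beta1 ** self.t)
        s_hat = self.s / (1 - self.beta2 ** self.t)
        return w - self.lr * m_hat / (np.sqrt(s_hat) + self.eps)
```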
We built an ANN from scratch on the MNIST dataset of handwritten digits using Python and NumPy, and compared the algorithms with respect to their speed of convergence.
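A toy version of such a comparison could be run as follows, using the sketch classes above on a simple quadratic rather than the full MNIST network (the setup is an illustrative assumption):

```python
# Compare how quickly each optimizer drives f(w) = w^2 toward zero.
for name, opt in [("SGD", SGD(lr=0.1)), ("Momentum", Momentum(lr=0.1)),
                  ("RMSprop", RMSprop(lr=0.1)), ("Adam", Adam(lr=0.1))]:
    w = np.array(5.0)
    for _ in range(50):
        w = opt.update(w, 2 * w)  # gradient of w^2 is 2w
    print(f"{name}: loss after 50 steps = {w ** 2:.6f}")
```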