Experiments with Neural Networks for Handwritten Digit Classification on the MNIST dataset:
- Simple neural network with 1 hidden layer
- Neural network with 2 hidden layers, comparing two activation functions: ReLU and Tanh
- Neural network with batch normalization
- Neural network with He weight initialisation for the ReLU activation
- Neural network with dropout to reduce overfitting
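As a baseline for the first two experiments, a 1-hidden-layer network can be sketched as a forward pass in NumPy. The hidden size (128), the random stand-in batch, and the NumPy implementation are illustrative assumptions, not details from the original experiments; the `activation` parameter hints at how the ReLU/Tanh comparison could be wired in.

```python
import numpy as np

rng = np.random.default_rng(0)

# MNIST shapes: 784 flattened input pixels, 10 digit classes.
# Hidden size 128 is an illustrative choice, not from the original.
n_in, n_hidden, n_out = 784, 128, 10

W1 = rng.normal(0.0, 0.01, (n_in, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 0.01, (n_hidden, n_out))
b2 = np.zeros(n_out)

def forward(x, activation=np.tanh):
    """Forward pass: one hidden layer, softmax output.
    `activation` can be np.tanh or a ReLU such as lambda z: np.maximum(0, z)."""
    h = activation(x @ W1 + b1)
    logits = h @ W2 + b2
    e = np.exp(logits - logits.max(axis=1, keepdims=True))  # numerically stable softmax
    return e / e.sum(axis=1, keepdims=True)

x = rng.random((32, n_in))      # stand-in for a batch of flattened MNIST images
probs = forward(x)
print(probs.shape)              # (32, 10)
```

Each row of `probs` is a probability distribution over the 10 digits, so it sums to 1.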
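The batch-normalization experiment can be illustrated with a minimal NumPy sketch of the train-time transform: normalise each feature over the batch, then apply a learnable scale and shift. The shapes and the badly scaled synthetic input are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(3)

def batch_norm(h, gamma, beta, eps=1e-5):
    """Normalise each feature over the batch, then scale (gamma) and shift (beta)."""
    mu = h.mean(axis=0)
    var = h.var(axis=0)
    h_hat = (h - mu) / np.sqrt(var + eps)
    return gamma * h_hat + beta

h = rng.normal(5.0, 3.0, (64, 128))   # badly scaled pre-activations (illustrative)
out = batch_norm(h, gamma=np.ones(128), beta=np.zeros(128))

print(float(out.mean()), float(out.std()))  # approximately 0 and 1
```

With `gamma=1` and `beta=0` the output is simply standardised per feature; in a full implementation these would be trained parameters, and running averages of `mu` and `var` would be kept for inference.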
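The He-initialisation experiment rests on drawing weights with standard deviation sqrt(2 / fan_in), which keeps the scale of ReLU activations roughly constant with depth. A small NumPy sketch can check this; the layer width (256), depth (5), and sample count are arbitrary illustrative values.

```python
import numpy as np

rng = np.random.default_rng(1)

def he_init(fan_in, fan_out, rng):
    # He (Kaiming) initialisation for ReLU layers: std = sqrt(2 / fan_in)
    return rng.normal(0.0, np.sqrt(2.0 / fan_in), (fan_in, fan_out))

# Pass unit-variance inputs through several ReLU layers and check
# that the activation scale neither collapses nor explodes.
x = rng.normal(0.0, 1.0, (500, 256))
for _ in range(5):
    W = he_init(x.shape[1], 256, rng)
    x = np.maximum(0.0, x @ W)

rms = float(np.sqrt(np.mean(x ** 2)))
print(rms)  # stays on the order of 1 across the 5 layers
```

With a naive small-variance initialisation (e.g. std 0.01) the same loop would shrink the activations toward zero layer by layer, which is exactly the failure He initialisation is designed to avoid.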
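The dropout experiment can be sketched as inverted dropout in NumPy: at train time, zero each unit with probability `p` and rescale the survivors by 1/(1-p), so that no rescaling is needed at test time. The rate `p=0.5` and the all-ones stand-in activations are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def dropout(h, p, rng, train=True):
    """Inverted dropout: drop units with probability p and rescale
    the survivors by 1/(1-p) so the expected activation is unchanged."""
    if not train or p == 0.0:
        return h
    mask = (rng.random(h.shape) >= p) / (1.0 - p)
    return h * mask

h = np.ones((10000, 128))            # stand-in hidden activations
out = dropout(h, p=0.5, rng=rng)     # train time: ~half zeroed, rest scaled by 2

print(float(out.mean()))             # close to 1: expected activation preserved
print(dropout(h, p=0.5, rng=rng, train=False) is h)  # True: identity at test time
```

Keeping the expected activation constant between training and inference is what lets the same weights be used unmodified at test time.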