An implementation of a fully connected neural network (MLP) in pure NumPy.
A simple MLP built entirely on NumPy. Some components of a neural network are included; others are still in development. The following have been implemented:
- ReLU and sigmoid activations.
- Cross-entropy loss.
- Xavier and MSRA initializations.
- Mini-batch gradient descent.
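The listed components can be sketched in plain NumPy as follows. This is an illustrative standalone sketch, not the repo's actual code; the function names and the `1e-12` clipping constant are assumptions.

```python
import numpy as np

def relu(x):
    # ReLU activation: max(0, x), element-wise
    return np.maximum(0.0, x)

def sigmoid(x):
    # Logistic sigmoid activation
    return 1.0 / (1.0 + np.exp(-x))

def cross_entropy(probs, labels):
    # Mean cross-entropy over a batch; labels are integer class indices,
    # probs are predicted class probabilities (rows sum to 1)
    n = probs.shape[0]
    return -np.mean(np.log(probs[np.arange(n), labels] + 1e-12))

def xavier_init(fan_in, fan_out):
    # Xavier/Glorot uniform initialization
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return np.random.uniform(-limit, limit, size=(fan_in, fan_out))

def msra_init(fan_in, fan_out):
    # MSRA/He initialization, suited to ReLU layers
    return np.random.randn(fan_in, fan_out) * np.sqrt(2.0 / fan_in)
```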
We start with a multi-class classification problem, using the MNIST dataset (60,000 training samples and 10,000 test samples) as an example. We construct an MLP with two hidden layers, one with 256 neurons and the other with 64. The accuracy on the test set reaches 0.9819 after 50 epochs. For details, please refer to example.py.
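The forward pass of the described architecture (784 inputs → 256 → 64 → 10 with softmax output) can be sketched in plain NumPy. This is a hedged illustration under assumed names (`softmax`, `forward`), not the repo's actual implementation:

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over each row
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Layer sizes matching the MNIST example: 784 -> 256 -> 64 -> 10
rng = np.random.default_rng(0)
sizes = [784, 256, 64, 10]
weights = [rng.standard_normal((i, o)) * np.sqrt(2.0 / i)  # He-style init
           for i, o in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(o) for o in sizes[1:]]

def forward(x):
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = np.maximum(0.0, h @ W + b)  # hidden layers use ReLU
    return softmax(h @ weights[-1] + biases[-1])  # class probabilities

batch = rng.standard_normal((32, 784))
probs = forward(batch)  # shape (32, 10); each row sums to 1
```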
- NumPy
- python=3.6.12
```python
from NN import Dense, Model

MLP = Model(0.1)  # constructor argument: 0.1 (learning rate, assumed)
MLP.add(Dense(100, 64, activation='relu'))  # hidden layer: 100 -> 64, ReLU
MLP.add(Dense(64, 10, activation='None'))   # output layer: 64 -> 10, no activation
```
- Add loss functions.
- Add tanh and other activations.
- Add optimizers.
- Add learning rate decay.
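For the learning rate decay item above, one common schedule is exponential decay; the function name, default rate, and interface here are assumptions for illustration, not a planned API:

```python
# Exponential learning-rate decay: lr shrinks by a fixed factor each epoch.
def decayed_lr(initial_lr, epoch, decay_rate=0.95):
    # decay_rate is an assumed hyperparameter; values near 1.0 decay slowly
    return initial_lr * decay_rate ** epoch
```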