MLP-implementation-with-numpy

A fully connected neural network implemented in pure NumPy.

Introduction

A simple MLP implemented purely in NumPy. Some neural-network components are included so far, and others are still in development. What has been done:

  • ReLU and sigmoid activations.
  • Cross-entropy loss.
  • Xavier and MSRA (He) initializations.
  • Mini-batch gradient descent.
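The activations and weight initializations above can be sketched in a few lines of NumPy (this is an illustrative sketch, not the repo's actual code; function names here are assumptions):

```python
import numpy as np

def relu(x):
    # ReLU: elementwise max(0, x)
    return np.maximum(0.0, x)

def sigmoid(x):
    # Logistic sigmoid, with clipping to avoid overflow in exp
    return 1.0 / (1.0 + np.exp(-np.clip(x, -500, 500)))

def xavier_init(fan_in, fan_out, rng=None):
    # Xavier/Glorot uniform: limit = sqrt(6 / (fan_in + fan_out)),
    # commonly paired with sigmoid/tanh layers
    rng = np.random.default_rng(0) if rng is None else rng
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

def msra_init(fan_in, fan_out, rng=None):
    # MSRA/He normal: std = sqrt(2 / fan_in), commonly paired with ReLU layers
    rng = np.random.default_rng(0) if rng is None else rng
    return rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))
```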

Example

Start with a multi-class classification problem, using the MNIST dataset (60,000 training samples and 10,000 test samples) as an example.

Construct an MLP with two hidden layers of 256 and 64 neurons. The accuracy on the test set reaches 0.9819 after 50 epochs. For details, please refer to example.py.
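The training loop behind such an example (mini-batch gradient descent through ReLU hidden layers and a softmax/cross-entropy output) can be sketched end-to-end in pure NumPy. This is a self-contained toy sketch on synthetic data, not the repo's example.py; all sizes and names here are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny learnable 3-class problem standing in for MNIST
# (MNIST would be 784 -> 256 -> 64 -> 10; smaller sizes keep the sketch fast)
X = rng.normal(size=(300, 20))
y = (X @ rng.normal(size=(20, 3))).argmax(axis=1)
Y = np.eye(3)[y]                                  # one-hot labels

sizes = [20, 16, 8, 3]                            # input, two hidden layers, output
Ws = [rng.normal(0, np.sqrt(2.0 / m), size=(m, n)) for m, n in zip(sizes, sizes[1:])]
bs = [np.zeros(n) for n in sizes[1:]]

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))  # shift for numerical stability
    return e / e.sum(axis=1, keepdims=True)

lr, batch, epochs = 0.1, 32, 50
for _ in range(epochs):
    order = rng.permutation(len(X))
    for start in range(0, len(X), batch):
        xb, yb = X[order[start:start + batch]], Y[order[start:start + batch]]
        # forward pass: ReLU hidden layers, softmax output
        a = [xb]
        for W, b in zip(Ws[:-1], bs[:-1]):
            a.append(np.maximum(0.0, a[-1] @ W + b))
        p = softmax(a[-1] @ Ws[-1] + bs[-1])
        # backward pass: gradient of mean cross-entropy w.r.t. the logits
        delta = (p - yb) / len(xb)
        for i in range(len(Ws) - 1, -1, -1):
            gW, gb = a[i].T @ delta, delta.sum(axis=0)
            if i > 0:
                delta = (delta @ Ws[i].T) * (a[i] > 0)   # ReLU derivative
            Ws[i] -= lr * gW                              # plain SGD step
            bs[i] -= lr * gb

# training accuracy after 50 epochs
h = X
for W, b in zip(Ws[:-1], bs[:-1]):
    h = np.maximum(0.0, h @ W + b)
pred = (h @ Ws[-1] + bs[-1]).argmax(axis=1)
print("train accuracy:", (pred == y).mean())
```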

[Figure: test accuracy over training epochs]

Environment

  • NumPy
  • Python 3.6.12

Usage

from NN import Dense, Model

MLP = Model(0.1)                             # 0.1: presumably the learning rate
MLP.add(Dense(100, 64, activation='relu'))   # hidden layer, 100 -> 64, ReLU
MLP.add(Dense(64, 10, activation='None'))    # output layer, 64 -> 10, no activation
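Internally, a Dense layer like the one above boils down to a matrix multiply plus bias, with cached inputs for backpropagation. The sketch below is a hypothetical illustration of such a layer, not the repo's actual Dense class:

```python
import numpy as np

class DenseSketch:
    """Hypothetical pure-NumPy dense layer (not the repo's actual Dense)."""

    def __init__(self, n_in, n_out, activation='relu', seed=0):
        rng = np.random.default_rng(seed)
        # He-style initialization, a reasonable default for ReLU layers
        self.W = rng.normal(0.0, np.sqrt(2.0 / n_in), size=(n_in, n_out))
        self.b = np.zeros(n_out)
        self.activation = activation

    def forward(self, x):
        self.x = x                              # cache input for backward pass
        self.z = x @ self.W + self.b            # affine transform
        return np.maximum(0.0, self.z) if self.activation == 'relu' else self.z

    def backward(self, grad_out, lr):
        if self.activation == 'relu':
            grad_out = grad_out * (self.z > 0)  # ReLU derivative
        grad_W = self.x.T @ grad_out
        grad_b = grad_out.sum(axis=0)
        grad_x = grad_out @ self.W.T            # gradient w.r.t. layer input
        self.W -= lr * grad_W                   # in-place SGD update
        self.b -= lr * grad_b
        return grad_x
```

Stacking such layers and chaining their `backward` returns is all a `Model.add`-style container needs to do.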

Todo

  • Add loss functions.
  • Add tanh and other activations.
  • Add optimizers.
  • Add learning rate decay.
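One of the to-do items, learning rate decay, is small enough to sketch here. This is one common schedule (step decay), not necessarily what the repo will implement:

```python
def step_decay(lr0, epoch, drop=0.5, every=10):
    # Multiply the initial learning rate by `drop` every `every` epochs,
    # e.g. 0.1 -> 0.05 at epoch 10 -> 0.025 at epoch 20
    return lr0 * drop ** (epoch // every)
```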

