This project is not meant for developing production deep learning models. I built this framework myself to get a better understanding of how neural networks work, and for fun of course.
This project lets you easily implement from scratch any feedforward network you wish to model, trained with the famous backpropagation algorithm, popularized by the article "Learning representations by back-propagating errors" by David E. Rumelhart, Geoffrey E. Hinton, and Ronald J. Williams (1986).
The implemented algorithm is also developed and explained in Deep Learning by Ian Goodfellow et al. (2016): Algorithm 6.4.
It can handle layers of different sizes, activation functions, loss functions, and weight regularization. All of this is detailed below in Create a model.
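To give an idea of what the algorithm does, here is a minimal NumPy sketch of one forward/backward pass for a single hidden layer with MSE loss. This is a generic illustration of backpropagation, not this repo's actual code:

```python
# Minimal NumPy sketch of backpropagation (one hidden layer, MSE loss).
# Illustration of the algorithm only, not this repo's implementation.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 1))          # one (flattened) input sample
t = rng.normal(size=(2, 1))          # target
W1, b1 = rng.normal(size=(3, 4)), np.zeros((3, 1))
W2, b2 = rng.normal(size=(2, 3)), np.zeros((2, 1))
lrate = 0.1

# Forward pass
z1 = W1 @ x + b1
a1 = np.tanh(z1)                     # hidden activation
y = W2 @ a1 + b2                     # linear output
loss = np.mean((y - t) ** 2)

# Backward pass: propagate the error from the output back to the input
dy = 2 * (y - t) / t.size            # dL/dy for MSE
dW2, db2 = dy @ a1.T, dy
da1 = W2.T @ dy
dz1 = da1 * (1 - a1 ** 2)            # tanh derivative
dW1, db1 = dz1 @ x.T, dz1

# Plain SGD update (in place)
for p, g in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
    p -= lrate * g
```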
- Clone the repo
  ```sh
  git clone git@github.com:art-test-stack/homemade_neural_network.git
  ```
- Create a virtual environment. For example, I use virtualenv:
  ```sh
  virtualenv -p python3.10 venv
  source venv/bin/activate
  ```
- Install the pip packages
  ```sh
  pip install -r requirements.txt
  ```
Use the MNIST or doodle dataset for image classification.
You can easily create the doodle dataset by running:
```sh
python create_dataset.py
```
To create a model architecture, you have to specify it in configs/basic_config.yaml. You can change the path in settings.py.
You have to specify the following:
- Global parameters:
  - `loss`: the loss function used for training. It can be `'cross_entropy'` or `'mse'` (for now).
  - `lrate`: the learning rate for training. It has to be a `float`.
  - `wrt`: the weight regularization type. It can be `'L1'`, `'L2'`, or `None`.
  - `wreg`: the regularization weight. It has to be a `float`.
- Layers:
  - `input`: the size of your (flattened) input. It has to be an `int`.
  - `hidden_layers`: a list of layers, each with the following structure:
    - `size`: the number of nodes in the layer. It has to be an `int`.
    - `act`: the activation function. It can be `'sigmoid'`, `'tanh'`, `'linear'`, `'relu'`, or `'leaky_relu'`. Default: `'linear'`.
    - `wr`: the weight range for initialization. It can be a tuple of two `float`s between which the weights are uniformly initialized, or `'glorot'` for Xavier Glorot weight initialization.
    - `br`: same as `wr`, but for the biases.
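As an illustration, a config following the fields above might look like the sketch below. The exact YAML layout expected by the parser is an assumption here, so check configs/basic_config.yaml in the repo for the authoritative structure:

```yaml
# Hypothetical example built from the parameters described above.
# The exact structure expected by configs/basic_config.yaml may differ.
loss: cross_entropy
lrate: 0.01
wrt: L2
wreg: 0.001

input: 784            # e.g. a flattened 28x28 MNIST image
hidden_layers:
  - size: 128
    act: relu
    wr: glorot
    br: [0.0, 0.1]    # biases initialized uniformly in this range
  - size: 64
    act: tanh
    wr: [-0.1, 0.1]
    br: glorot
```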
Just run:
```sh
python train.py
```
This framework lets you easily create a neural network without coding, and train it on any data. So anyone who wants to create a neural network but doesn't know how to code can use it as a first step to see how neural nets work!
However, I don't recommend it, it's better to code lol.
- Change layers and module internal structure
- Change backpropagation call
- Add optimizers other than SGD (see the sketch after this list)
- Add recurrent and convolutional layers
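On the optimizer point: training currently uses plain SGD. As a sketch of how one SGD step with the `wrt`/`wreg` regularization options could look (assumed behavior, not necessarily this repo's exact implementation):

```python
import numpy as np

def sgd_step(w: np.ndarray, grad: np.ndarray, lrate: float,
             wrt: str | None = None, wreg: float = 0.0) -> np.ndarray:
    """One plain SGD update with optional L1/L2 weight regularization.

    Sketch of the assumed behavior of the `wrt`/`wreg` config options;
    not necessarily this repo's exact code.
    """
    if wrt == "L1":
        grad = grad + wreg * np.sign(w)   # L1 pushes weights toward zero
    elif wrt == "L2":
        grad = grad + wreg * w            # L2 shrinks weights proportionally
    return w - lrate * grad
```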
Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.
If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". Don't forget to give the project a star! Thanks again!
- Fork the Project
- Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
- Commit your Changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the Branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
Distributed under the MIT License. See LICENSE.txt for more information.
Arthur Testard - testardarthur@gmail.com
Project Link: https://github.com/art-test-stack/homemade_neural_network