The first notebook, `Planar_data_classification_with_one_hidden_layer`, in this repository covers a simple implementation of a neural network with a single hidden layer. The notebook covers the following steps:
- Implement a 2-class neural network with a single hidden layer
- Use units with a non-linear activation function, such as tanh
- Compute the cross-entropy loss
- Implement forward and backward propagation
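The steps above can be sketched in a single NumPy training loop. This is a minimal illustration, not the notebook's exact code: the function name, learning rate, and hidden size are placeholders, and the network uses a tanh hidden layer with a sigmoid output trained on the cross-entropy loss.

```python
import numpy as np

def train_one_hidden_layer(X, Y, n_h=4, lr=1.2, iters=5000, seed=3):
    """Tiny 2-class NN: tanh hidden layer, sigmoid output, cross-entropy loss.

    X: (n_x, m) inputs, Y: (1, m) labels in {0, 1}. Illustrative sketch only.
    """
    rng = np.random.default_rng(seed)
    n_x, m = X.shape
    # Small random init breaks symmetry between hidden units
    W1 = rng.standard_normal((n_h, n_x)) * 0.01
    b1 = np.zeros((n_h, 1))
    W2 = rng.standard_normal((1, n_h)) * 0.01
    b2 = np.zeros((1, 1))
    for _ in range(iters):
        # Forward propagation
        Z1 = W1 @ X + b1
        A1 = np.tanh(Z1)                      # non-linear hidden activation
        Z2 = W2 @ A1 + b2
        A2 = 1 / (1 + np.exp(-Z2))            # sigmoid output
        # Cross-entropy loss, averaged over the m examples
        loss = -np.mean(Y * np.log(A2) + (1 - Y) * np.log(1 - A2))
        # Backward propagation
        dZ2 = A2 - Y
        dW2 = dZ2 @ A1.T / m
        db2 = dZ2.mean(axis=1, keepdims=True)
        dZ1 = (W2.T @ dZ2) * (1 - A1 ** 2)    # tanh'(z) = 1 - tanh(z)^2
        dW1 = dZ1 @ X.T / m
        db1 = dZ1.mean(axis=1, keepdims=True)
        # Gradient descent update
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    return (W1, b1, W2, b2), loss
```

Because tanh is non-linear, even this single hidden layer can fit decision boundaries that logistic regression cannot, which is the point the notebook demonstrates on planar data.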
The second notebook, `Building your Deep Neural Network Step by Step`, takes it one step further and explains how to build a deep neural network with as many layers as we want. The model highlights include:
- Using non-linear units like ReLU to improve your model
- Building a deeper neural network (with more than 1 hidden layer)
- Implementing an easy-to-use neural network class that can be integrated into an object-oriented code base for building complex services
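As an illustration of that class-based design, here is a minimal sketch of an L-layer network with ReLU hidden layers and a sigmoid output. The class name, constructor signature, and initialization scheme are assumptions for the example, not the notebook's exact API, and only the forward pass is shown.

```python
import numpy as np

class DeepNN:
    """Minimal L-layer network: ReLU hidden layers, sigmoid output.

    `layer_dims` lists layer sizes, e.g. [n_x, 20, 7, 1].
    Hypothetical sketch; not the notebook's exact implementation.
    """

    def __init__(self, layer_dims, seed=1):
        rng = np.random.default_rng(seed)
        self.L = len(layer_dims) - 1          # number of weight layers
        self.params = {}
        for l in range(1, self.L + 1):
            # He-style scaling keeps ReLU activations well-conditioned
            self.params[f"W{l}"] = rng.standard_normal(
                (layer_dims[l], layer_dims[l - 1])
            ) * np.sqrt(2 / layer_dims[l - 1])
            self.params[f"b{l}"] = np.zeros((layer_dims[l], 1))

    def forward(self, X):
        """Forward pass: (L-1) ReLU layers, then a sigmoid output layer."""
        A = X
        for l in range(1, self.L):
            Z = self.params[f"W{l}"] @ A + self.params[f"b{l}"]
            A = np.maximum(0, Z)              # ReLU hidden activation
        ZL = self.params[f"W{self.L}"] @ A + self.params[f"b{self.L}"]
        return 1 / (1 + np.exp(-ZL))          # sigmoid for 2-class output

    def predict(self, X, threshold=0.5):
        return (self.forward(X) > threshold).astype(int)
```

Packaging the parameters and propagation steps in one class is what makes the model easy to reuse: adding a layer is just one more entry in `layer_dims`.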