Dann Object
When you create a neural network, you need to specify the size of the input & output layers.
const nn = new Dann(2,2);
- arch
This value represents the architecture of the model in the form of an array of layer sizes. For example, a Dann(2,2) with one hidden layer of 4 neurons has an arch of [2, 4, 2].
- lr
This defines the learning rate of the model. This value is set to 0.001 by default.
- epoch
This value starts out empty. It is meant for you to increment whenever you have completed one epoch. This serves as a way to save the number of epochs along with the weights in the dannData.json file.
- loss
This is the most recent loss value of the model. If the model has never been trained before, this value will be set to 0.
feedForward feeds data through the model to obtain an output.
- input
Takes an array of inputs to feed forward through the network.
- options (optional)
An object including specific properties.
- returns
Returns an array of outputs.
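For intuition, here is a plain-JavaScript sketch of what a feed-forward pass through one dense layer computes: a weighted sum of the inputs plus a bias, passed through an activation function such as sigmoid. This is an illustration, not the library's internals.

```javascript
// Sketch of a single dense-layer forward pass.
const sigmoid = (x) => 1 / (1 + Math.exp(-x));

function layerForward(inputs, weights, biases) {
  // weights[j][i] connects input i to output neuron j
  return weights.map((row, j) => {
    const sum = row.reduce((acc, w, i) => acc + w * inputs[i], biases[j]);
    return sigmoid(sum);
  });
}

// Two inputs, one output neuron:
const out = layerForward([1, 0], [[0.5, -0.3]], [0]);
console.log(out); // [0.6224...] — a single value between 0 and 1
```

The library repeats this step once per layer, feeding each layer's output into the next.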
backpropagate trains the model's weights on one input/target pair.
- input
Takes an array of inputs.
- target
Takes an array of desired outputs.
- options (optional)
An object including specific properties.
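At its core, each backpropagation call nudges every weight against the gradient of the loss, scaled by the learning rate. Here is a minimal one-weight sketch of that idea using the mean squared error derivative (illustration only; the library applies this layer by layer):

```javascript
// One-weight gradient-descent sketch: pred = w * input, loss = (pred - target)^2
let w = 0.5;
const lr = 0.1; // learning rate, like the model's lr property
const input = 1, target = 1;

for (let e = 0; e < 50; e++) {
  const pred = w * input;                       // forward pass
  const gradient = 2 * (pred - target) * input; // d(loss)/dw
  w -= lr * gradient;                           // weight update
}
console.log(w); // converges toward 1, so pred approaches target
```

Repeating the update many times, as in the XOR training loop at the end of this page, is what drives the loss down.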
This function randomly mutates each weight. This is for Neuroevolution tasks.
- range
A random number between -range and range is added to each affected weight.
- probability
The probability of a weight being affected by a random mutation. Ranging from 0 to 1. Setting this value to 1 would mutate all the model's weights.
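To make the mutation concrete, here is a plain-JavaScript sketch of the idea over a flat weight array (an illustrative helper, not the library's internals): each weight has a probability chance of receiving a random offset drawn from [-range, range].

```javascript
// Sketch of random mutation: with the given probability, shift each
// weight by a random amount no larger than `range` in magnitude.
function randomMutation(weights, range, probability) {
  return weights.map((w) =>
    Math.random() < probability ? w + (Math.random() * 2 - 1) * range : w
  );
}

const base = [0.1, -0.4, 0.7];
const mutated = randomMutation(base, 0.05, 1);
// with probability 1, every weight moves, but by at most 0.05
```

Setting probability to 0 would leave the weights untouched; 1 mutates all of them, as noted above.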
This function mutates the weights. This is for Neuroevolution tasks.
- percent
Each weight is multiplied by percent, and the result is added to that weight.
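Taking the description above literally, the update is w = w + w * percent. A small sketch of that deterministic reading (illustrative only; the library's implementation may also randomize the shift):

```javascript
// Sketch of percent-based mutation: shift each weight by a
// fraction of its own value.
function percentMutation(weights, percent) {
  return weights.map((w) => w + w * percent);
}

console.log(percentMutation([1, -2], 0.1)); // [1.1, -2.2]
```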
addHiddenLayer adds one hidden layer.
- size
Layer size, the number of neurons in the layer.
- activation
Takes a string of the activation function's name. If left empty, the activation function will be set to 'sigmoid' by default. See available activation functions here.
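Each call inserts the new layer's size into the model's architecture array, just before the output layer. A sketch of that effect (hypothetical helper for illustration; the library updates arch for you):

```javascript
// Illustration: how the arch array grows when a hidden layer is added.
function withHiddenLayer(arch, size) {
  // keep input and existing hidden sizes, insert before the output size
  return [...arch.slice(0, -1), size, arch[arch.length - 1]];
}

let arch = [2, 2];          // a Dann(2,2) starts with input and output only
arch = withHiddenLayer(arch, 4);
console.log(arch);          // [2, 4, 2]
```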
makeWeights creates the weights. This function should be called after all the hidden layers have been added. The optional parameters determine the range in which the starting weights are randomly set. If no arguments are specified, weights will be set between -1 and 1.
- min (optional)
The minimum range value.
- max (optional)
The maximum range value.
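A plain-JavaScript sketch of this kind of uniform random initialization (illustrative only, not the library's internals):

```javascript
// Sketch: draw `count` starting weights uniformly from [min, max].
function randomWeights(count, min = -1, max = 1) {
  return Array.from({ length: count }, () => min + Math.random() * (max - min));
}

const ws = randomWeights(6, -0.1, 0.1);
// all six values fall between -0.1 and 0.1
```

Narrower ranges like [-0.1, 0.1] are sometimes preferred to keep early activations away from the flat regions of sigmoid-like functions.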
outputActivation sets the activation function of the output layer.
- activation
Takes a string of the activation function's name. If this function is not called, the activation function will be set to 'sigmoid' by default. See available activation functions here.
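The choice matters because different activations have different output ranges: sigmoid squashes values into (0, 1), while tanh maps into (-1, 1). Pick the one that matches your target range. A quick check:

```javascript
// Output ranges of two common activations.
const sigmoid = (x) => 1 / (1 + Math.exp(-x));

console.log(sigmoid(5));    // close to 1
console.log(sigmoid(-5));   // close to 0
console.log(Math.tanh(-5)); // close to -1: tanh can output negatives
```

For XOR targets of 0 and 1, as in the example at the end of this page, sigmoid is a natural fit for the output layer.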
Sets the loss function of the model.
- lossfunc
Takes a string of the loss function's name. If this function is not called, the loss function will be set to 'mse' by default. See available loss functions here.
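As a reference point, the default 'mse' loss is the mean of the squared differences between the model's outputs and the targets. A minimal sketch:

```javascript
// Mean squared error: average of (output - target)^2 over all outputs.
function mse(outputs, targets) {
  const sum = outputs.reduce((acc, o, i) => acc + (o - targets[i]) ** 2, 0);
  return sum / outputs.length;
}

console.log(mse([0.5, 0.0], [1, 0])); // 0.125
```

This is the value stored in the model's loss property after training.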
Displays information about the model in the console.
- options (optional)
An object including specific properties.
Saves a name.json file containing information about the network and its current state. When the function is called, a local file dialog is opened by the browser.
Saves a JSON file containing information about the network and its current state in ./savedDanns/name/modelData.json. It also saves the losses in ./savedDanns/name/losses.csv if the Dann.recordLoss property is set to true.
When this function is called, an input tag requesting a file appears on screen. Clicking it opens a local file dialog. Once the appropriate file is selected, the Dann data is loaded automatically. The filename argument is not required for this version since the browser dialog takes care of it.
Loads a previously saved JSON file from ./savedDanns/. If the network's architecture is not the same, it is going to overwrite the current Dann object.
Here is a neural network solving XOR:
const Dann = require('dannjs').dann; //nodejs only
// XOR neural network
const nn = new Dann(2,1);
nn.addHiddenLayer(4,'tanH');
nn.outputActivation('sigmoid');
nn.makeWeights();
// feeding data to the model before training
nn.feedForward([0,0],{log:true});
// training the model
const epoch = 10000;
for (let e = 0; e < epoch; e++) {
nn.backpropagate([0,0],[0]);
nn.backpropagate([1,1],[0]);
nn.backpropagate([0,1],[1]);
nn.backpropagate([1,0],[1]);
}
// feeding data to the model after training
nn.feedForward([0,0],{log:true});