Merge pull request #1 from will-newmarch/release-0.2.0
Release 0.2.0
will-newmarch authored Mar 17, 2019
2 parents 5b52885 + 5329a51 commit 4859096
Showing 5 changed files with 143 additions and 20 deletions.
2 changes: 2 additions & 0 deletions .gitignore
@@ -61,3 +61,5 @@ typings/

# next.js build output
.next

.idea
3 changes: 3 additions & 0 deletions .travis.yml
@@ -0,0 +1,3 @@
language: node_js
node_js:
- "8"
124 changes: 123 additions & 1 deletion README.md
@@ -1,5 +1,6 @@
[![Build Status](https://travis-ci.org/will-newmarch/intuitive-neural-network.svg?branch=master)](https://travis-ci.org/will-newmarch/intuitive-neural-network)
# Intuitive Neural Network
### When would like to understand what's going on without the mathematics!
### When you would like to understand what's going on without the mathematics!
A JavaScript-based, object-oriented neural network library.

The two aims behind creating this library are:
@@ -9,3 +10,124 @@ The two aims behind creating this library are:
---

Run the XOR example with `npm test`

---

## A Simple Implementation (XOR Problem)

// Assumes Network has been imported, e.g.
// const Network = require('./src/Network.js');

// Build the network...
var network = new Network({
    layers: [2,2,1],
    bias: false
});

// Training data (x in, y out)
var data = [
    {x: [0,0], y: [0]},
    {x: [0,1], y: [1]},
    {x: [1,0], y: [1]},
    {x: [1,1], y: [0]}
];

// Training the network...
var epochs = 10000;
var learningRate = 0.01;

for (var h = 0; h < epochs; h++) {

    for (var i = 0; i < data.length; i++) {

        // Pick a random training sample and run one complete pass over it.
        let index = Math.floor(Math.random() * data.length);
        network
            .fire(data[index].x)
            .backPropagate(data[index].y)
            .applyError(learningRate)
            .reset();

    }
}
// Done.

// Testing the trained network...
for (var i = 0; i < data.length; i++) {

    network.fire(data[i].x);

    // Read the activation of the single output neuron.
    var activation = network.layers[network.layers.length-1].neurons[0].activation;

    // expect Math.round(activation) to equal data[i].y[0]

    network.reset();

}
// Done.

---


<a name="Network"></a>

## Network
**Kind**: global class

* [Network](#Network)
* [new Network(settings)](#new_Network_new)
* [.fire(signals)](#Network+fire)
* [.backPropagate(errors)](#Network+backPropagate)
* [.applyError(learningRate)](#Network+applyError)
* [.reset()](#Network+reset)

<a name="new_Network_new"></a>

### new Network(settings)
Constructor for Network


| Param | Type |
| --- | --- |
| settings | <code>object</code> |
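
For example, the XOR test constructs its network with the settings below (a sketch; the two commented-out activation fields appear only in an earlier version of that test and are shown here as optional):

```js
var network = new Network({
    layers: [2, 2, 1],   // neurons per layer: 2 inputs, 2 hidden, 1 output
    bias: false          // no bias neurons
    // hiddenActivationType: 'sigmoid',  // from an earlier XOR test; optional sketch
    // outputActivationType: 'identity'
});
```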

<a name="Network+fire"></a>

### network.fire(signals) ⇒
Fire the input layer's Neurons with the supplied array of floats

**Kind**: instance method of [<code>Network</code>](#Network)
**Returns**: Network (for chaining purposes)

| Param | Type |
| --- | --- |
| signals | <code>array</code> |
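
A minimal sketch, assuming `network` is the 2-2-1 XOR network built above and that `signals` supplies one value per input neuron:

```js
// Push one XOR input pair into the two input neurons.
network.fire([0, 1]);
```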

<a name="Network+backPropagate"></a>

### network.backPropagate(errors) ⇒
Initialise back propagation through the network with the supplied array of floats

**Kind**: instance method of [<code>Network</code>](#Network)
**Returns**: Network (for chaining purposes)

| Param | Type |
| --- | --- |
| errors | <code>array</code> |
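
A minimal sketch, mirroring the XOR example, where the expected output is passed as the `errors` argument straight after a forward pass:

```js
// Forward pass, then seed back propagation with the expected output [1].
network.fire([0, 1]).backPropagate([1]);
```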

<a name="Network+applyError"></a>

### network.applyError(learningRate) ⇒
Trigger each synapse to apply its error to its weight

**Kind**: instance method of [<code>Network</code>](#Network)
**Returns**: Network (for chaining purposes)

| Param | Type |
| --- | --- |
| learningRate | <code>float</code> |
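
Putting the pieces together, one training step from the XOR example looks like this (a sketch; 0.01 is simply the learning rate used in the test):

```js
var learningRate = 0.01;

// One complete training step: forward pass, back propagation,
// weight update, then reset ready for the next sample.
network
    .fire([0, 1])
    .backPropagate([1])
    .applyError(learningRate)
    .reset();
```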

<a name="Network+reset"></a>

### network.reset() ⇒
Reset all the Neurons and Synapses back to their initial state

**Kind**: instance method of [<code>Network</code>](#Network)
**Returns**: Network (for chaining purposes)
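
As in the test, `reset()` is also called between plain forward passes so stale activations never leak into the next one (a sketch using the XOR network above):

```js
// Fire the network, read the single output neuron, then reset.
network.fire([1, 0]);
var activation = network.layers[network.layers.length - 1].neurons[0].activation;
// After training, Math.round(activation) is expected to equal 1 for [1, 0].
network.reset();
```
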
6 changes: 6 additions & 0 deletions src/Network.js
@@ -98,6 +98,7 @@ class Network {
/**
* Fire the input layer's Neurons with the supplied array of floats
* @param {array} signals
* @returns Network (for chaining purposes)
*/
fire(signals) {
for (var i = 0; i < this.layers[0].neurons.length; i++) {
@@ -114,6 +115,7 @@
/**
* Initialise back propagation through the network with the supplied array of floats
* @param {array} errors
* @returns Network (for chaining purposes)
*/
backPropagate(errors) {
for (var i = 0; i < errors.length; i++) {
@@ -125,6 +127,7 @@
/**
* Trigger each synapse to apply its error to its weight
* @param {float} learningRate
* @returns Network (for chaining purposes)
*/
applyError(learningRate) {
this.layers.map(l => {
@@ -136,6 +139,7 @@
}
});
});
return this;
}

/**
@@ -153,6 +157,7 @@

/**
* Reset all the Neurons and Synapses back to their initial state
* @returns Network (for chaining purposes)
*/
reset() {
this.layers.map(l => {
@@ -165,6 +170,7 @@
}
});
});
return this;
}

/**
28 changes: 9 additions & 19 deletions xor-problem.test.js
@@ -2,16 +2,9 @@ const Network = require('./src/Network.js');

test('library solves XOR problem', () => {

// Various settings...
var learningRate = 0.01;
var epochs = 100000;
var activation = 'sigmoid';

// Build the network...
var network = new Network({
layers: [2,2,1],
hiddenActivationType: activation,
outputActivationType: 'identity',
bias: false
});

@@ -22,29 +15,27 @@ test('library solves XOR problem', () => {
{x: [1,1], y: [0]}
];

// Training...
// Training the network...
var epochs = 10000;
var learningRate = 0.01;

for (var h = 0; h < epochs; h++) {

for (var i = 0; i < data.length; i++) {

let index = Math.floor(Math.random() * data.length);

network.fire(data[index].x);

network.backPropagate(data[index].y);

network.applyError(learningRate);

network.reset();
network
.fire(data[index].x)
.backPropagate(data[index].y)
.applyError(learningRate)
.reset();

}
}

// Done.

// Testing...

// Testing the trained network...
for(var i = 0; i < data.length; i++) {

network.fire(data[i].x);
@@ -56,7 +47,6 @@ test('library solves XOR problem', () => {
network.reset();

}

// Done.

});
