This is a project I loved making to automate a homework assignment, and I found it quite interesting.
However, Python is still a horrible language to use.
- I've upgraded the system; it now has more built-in activation functions.
- I've added the ability to use multiclass activation functions, such as the built-in Softmax (see the first sketch after this list).
- Refined parts of the project and fixed bugs.
- Added the ability to control what happens to each layer at any given calculation stage (see the second sketch after this list).
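For context, a multiclass activation like Softmax takes a layer's whole output vector and turns it into a probability distribution, instead of acting on each value separately. A minimal NumPy sketch of the math (not the project's own code):

```python
import numpy as np

def softmax(z):
    """Map a vector of raw layer outputs to a probability distribution."""
    # Subtract the max before exponentiating for numerical stability.
    exps = np.exp(z - np.max(z))
    return exps / np.sum(exps)

# Example: three raw outputs become probabilities that sum to 1.
print(softmax(np.array([2.0, 1.0, 0.1])))  # ~[0.659, 0.242, 0.099]
```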
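The per-layer control can be pictured as optional hooks that run before and after a layer's activation; this is a hypothetical sketch with made-up names, not the project's actual interface:

```python
import numpy as np

class Layer:
    """Hypothetical layer with hooks at each calculation stage (illustrative only)."""
    def __init__(self, weights, activation, on_pre=None, on_post=None):
        self.weights = weights        # weight matrix for this layer
        self.activation = activation  # e.g. np.tanh or softmax
        self.on_pre = on_pre          # runs on the raw weighted sum
        self.on_post = on_post        # runs on the activated output

    def forward(self, x):
        z = self.weights @ x
        if self.on_pre:
            z = self.on_pre(z)
        a = self.activation(z)
        if self.on_post:
            a = self.on_post(a)
        return a

def log_stage(z):
    # Example hook: inspect the pre-activation values without changing them.
    print("pre-activation:", z)
    return z

layer = Layer(np.array([[0.5, -1.0], [0.25, 0.75]]), np.tanh, on_pre=log_stage)
print(layer.forward(np.array([1.0, 2.0])))
```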
Now we're talking: I'm upgrading the system to use a GUI. And for some reason Python libraries suck at doing this, so I'm building my own shell around them.
- Added the ability to control how many inputs to, well, input.
- Made a custom controllable and configurable input field (sketched after this list).
- The activation function frame is now mostly fully implemented.
- Activation functions are separated by their input and output types (see the registry sketch after this list).
- They can also be plotted, for better understanding (see the plotting sketch after this list).
- Activation functions can now be co-graphed.
- Graphs can now be cleared too.
- Layers can be set up with a matrix of input fields, making up all the weights of the neural network.
- Weight matrices can be resized in both height and width, with all adjacent matrices (including the input and output ones) adjusted to stay consistent.
- And finally, a calculation button to calculate the overall output of the neural network (see the forward-pass sketch after this list).
- Instead of going through the hassle of reprogramming the weight frame into a bias frame, I just made a uni-bias input field: a single value applied to all the biases.
- Now you're able to adjust the layer count dynamically.
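The custom input field is presumably a thin wrapper over a standard widget; here is a rough sketch of the idea using tkinter, with illustrative names rather than the project's real classes:

```python
import tkinter as tk

class NumberField(tk.Entry):
    """An Entry that only accepts floating-point input (illustrative sketch)."""
    def __init__(self, master, default="0.0", **kwargs):
        super().__init__(master, **kwargs)
        self.insert(0, default)
        # Validate on every keystroke; reject anything that is not a float prefix.
        vcmd = (self.register(self._validate), "%P")
        self.configure(validate="key", validatecommand=vcmd)

    def _validate(self, proposed):
        if proposed in ("", "-", ".", "-."):
            return True  # allow partial input while typing
        try:
            float(proposed)
            return True
        except ValueError:
            return False

    def value(self):
        text = self.get()
        return float(text) if text not in ("", "-", ".", "-.") else 0.0

if __name__ == "__main__":
    root = tk.Tk()
    NumberField(root, default="0.5", width=10).pack(padx=10, pady=10)
    root.mainloop()
```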
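One way to picture the separation by input and output type is a small registry keyed by kind, so the GUI knows which functions act element-wise and which work on whole vectors; a hypothetical sketch, not the project's actual structure:

```python
import numpy as np

# Hypothetical registry: scalar (element-wise) vs. vector (multiclass) activations.
ACTIVATIONS = {
    "scalar": {  # scalar in, scalar out
        "sigmoid": lambda x: 1.0 / (1.0 + np.exp(-x)),
        "tanh": np.tanh,
        "ReLU": lambda x: np.maximum(0.0, x),
    },
    "vector": {  # vector in, vector out
        "softmax": lambda z: np.exp(z - np.max(z)) / np.sum(np.exp(z - np.max(z))),
    },
}

def plottable():
    """Only scalar activations make sense on a 2-D plot of f(x) against x."""
    return ACTIVATIONS["scalar"]
```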
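Co-graphing boils down to drawing several curves on one set of axes and clearing them on demand; a minimal matplotlib sketch (not the project's plotter):

```python
import numpy as np
import matplotlib.pyplot as plt

def co_graph(functions, x_min=-5.0, x_max=5.0):
    """Plot several scalar activation functions on the same axes."""
    x = np.linspace(x_min, x_max, 400)
    fig, ax = plt.subplots()
    for name, fn in functions.items():
        ax.plot(x, fn(x), label=name)
    ax.legend()
    ax.grid(True)
    return fig, ax

fig, ax = co_graph({
    "sigmoid": lambda x: 1.0 / (1.0 + np.exp(-x)),
    "tanh": np.tanh,
    "ReLU": lambda x: np.maximum(0.0, x),
})
# Clearing the graph is just ax.cla() (or fig.clf() for the whole figure).
plt.show()
```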
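The calculation button ultimately just runs a forward pass over the weight matrices read from the input fields, and the dynamic layer count corresponds to how many matrices are in that list. A minimal NumPy sketch, assuming one shared (uni-) bias value per layer as described above, with illustrative names:

```python
import numpy as np

def forward(x, weight_matrices, activations, uniform_bias=0.0):
    """Run the network forward: each layer is weights @ x + bias, then activation."""
    a = np.asarray(x, dtype=float)
    for W, act in zip(weight_matrices, activations):
        a = act(W @ a + uniform_bias)  # one shared bias value for every neuron
    return a

# Example: 2 inputs -> hidden layer of 3 -> 1 output, tanh everywhere.
weights = [np.random.randn(3, 2), np.random.randn(1, 3)]
print(forward([0.5, -1.0], weights, [np.tanh, np.tanh], uniform_bias=0.1))
```

Adding or removing a layer is then just inserting or deleting a matrix from the list, as long as the adjacent matrices are resized so their dimensions still line up.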