The traditional way to optimize a neural network's weights is backpropagation with gradient descent. While quite effective at locating optima, it does not always converge to a global optimum, which we may want for certain complex tasks. Metaheuristics (including evolutionary algorithms), on the other hand, are well suited to finding a global optimum, especially for tasks with highly variable, noisy fitness landscapes.
Here, I use NeuroEvolution of Augmenting Topologies (NEAT), which optimizes a neural network's topology as well as its weights [1]. It is implemented from scratch and tested on the OpenAI Gym CartPole (cart-pole balancing) environment.
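For illustration, here is a minimal sketch of how a genome's fitness can be scored on CartPole. The `activate` method is a hypothetical name for the phenotype's forward pass (the actual API lives in neat/src), and the loop uses the classic Gym step/reset signatures (newer `gym`/`gymnasium` releases return different tuples):

```python
import gym  # classic OpenAI Gym API assumed; newer gymnasium differs


def evaluate(network, episodes=5):
    """Average reward over several CartPole episodes as the genome's fitness."""
    env = gym.make("CartPole-v1")
    total_reward = 0.0
    for _ in range(episodes):
        observation = env.reset()
        done = False
        while not done:
            # `network.activate` is a hypothetical forward pass that returns
            # one output activation per action; take the stronger one.
            outputs = network.activate(observation)
            action = int(outputs[1] > outputs[0])
            observation, reward, done, _ = env.step(action)
            total_reward += reward
    env.close()
    return total_reward / episodes
```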
Python 3 is required, along with the packages specified in the requirements.txt file.
Install dependencies:
pip3 install -U -r requirements.txt
Run NEAT:
python3 neat/src/main.py
Or run the fixed-topology neural network optimization:
python3 fixed-topology-ne/main.py
NOTE: This image was not generated by NEAT; it was produced while testing the visualization process.
- Reusable implementation of a fixed-topology network and its evolution (e.g. for balancing a cart pole in OpenAI Gym).
- Reusable implementation of NEAT with all its operators, such as mutation and crossover (see the crossover sketch after this list).
- Genotype-to-phenotype visualization (using the NetworkX graph library; see the visualization sketch after this list).
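To give a feel for the crossover operator, below is a minimal sketch of NEAT-style crossover as described in [1]: connection genes are aligned by innovation number, matching genes are inherited randomly from either parent, and disjoint/excess genes come from the fitter parent. The `fitness` and `connections` attributes are hypothetical names, not necessarily the ones used in this repo:

```python
import random


def crossover(parent1, parent2):
    """Sketch of NEAT crossover over connection genes.

    Assumes each parent has a `fitness` value and a dict `connections`
    mapping innovation number -> connection gene (hypothetical names).
    """
    # Make parent1 the fitter parent; disjoint and excess genes
    # are inherited from it, per the NEAT paper.
    if parent2.fitness > parent1.fitness:
        parent1, parent2 = parent2, parent1

    child_connections = {}
    for innovation, gene in parent1.connections.items():
        if innovation in parent2.connections:
            # Matching gene: inherit randomly from either parent.
            child_connections[innovation] = random.choice(
                [gene, parent2.connections[innovation]]
            )
        else:
            # Disjoint/excess gene: inherit from the fitter parent.
            child_connections[innovation] = gene
    return child_connections
```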
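And a minimal sketch of phenotype drawing with NetworkX, assuming connection genes are available as (in_node, out_node, weight) tuples (a hypothetical representation):

```python
import networkx as nx
import matplotlib.pyplot as plt


def draw_phenotype(connections):
    """Draw a phenotype from (in_node, out_node, weight) connection tuples."""
    graph = nx.DiGraph()
    for in_node, out_node, weight in connections:
        graph.add_edge(in_node, out_node, weight=round(weight, 2))
    pos = nx.spring_layout(graph, seed=42)  # deterministic layout
    nx.draw(graph, pos, with_labels=True, node_color="lightblue", arrows=True)
    nx.draw_networkx_edge_labels(
        graph, pos, edge_labels=nx.get_edge_attributes(graph, "weight")
    )
    plt.show()


# Example: a tiny network with one hidden node (node 2)
draw_phenotype([(0, 2, 0.5), (1, 2, -1.3), (2, 3, 0.8)])
```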
Contributions welcome! Just open a PR and @tag me!
- Novelty search for NEAT.
- Neuro-plasticity for evolution.
- Co-evolutionary methods (research needed).
- Distributed training to get results faster.
- An easy-to-use hyper-parameter tuning pipeline (similar to grid search).
- A unit-testing module and more unit tests. Extra testing layers (integration, smoke, regression, etc.) would also improve testability and reliability.
- Docker and a package manager to automate installing and updating packages/modules.
[1] Kenneth O. Stanley and Risto Miikkulainen. Evolving Neural Networks through Augmenting Topologies. Evolutionary Computation, 10(2):99–127, 2002.