Merge pull request #2867 from jstapmanns/eprop_feature
Implement e-prop plasticity according to Bellec G, Scherr F, Subramoney A, Hajek E, Salaj D, Legenstein R, Maass W (2020).
heplesser authored Feb 28, 2024
2 parents 35fd14b + 3fff77a commit ff23f32
Showing 72 changed files with 9,814 additions and 111 deletions.
17 changes: 13 additions & 4 deletions build_support/generate_modelsmodule.py
@@ -27,6 +27,7 @@
"""

import argparse
import itertools
import os
import sys
from pathlib import Path
@@ -109,6 +110,7 @@ def get_models_from_file(model_file):
"public Node": "node",
"public ClopathArchivingNode": "clopath",
"public UrbanczikArchivingNode": "urbanczik",
"public EpropArchivingNode": "neuron",
"typedef binary_neuron": "binary",
"typedef rate_": "rate",
}
@@ -227,9 +229,7 @@ def generate_modelsmodule():
1. the copyright header.
2. a list of generic NEST includes
3. the list of includes for the models to build into NEST
4. some boilerplate function implementations needed to fulfill the
Module interface
5. the list of model registration lines for the models to build
4. the list of model registration lines for the models to build
into NEST
The code is enriched by structured C++ comments as to make
@@ -246,7 +246,16 @@ def generate_modelsmodule():
modeldir.mkdir(parents=True, exist_ok=True)
with open(modeldir / fname, "w") as file:
file.write(copyright_header.replace("{{file_name}}", fname))
file.write('\n#include "models.h"\n\n// Generated includes\n#include "config.h"\n')
file.write(
dedent(
"""
#include "models.h"
// Generated includes
#include "config.h"
"""
)
)

for model_type, guards_fnames in includes.items():
file.write(f"\n// {model_type.capitalize()} models\n")
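The `dedent` call introduced in this diff strips the common leading indentation from the triple-quoted include header before it is written to the generated module file. A minimal standalone illustration of that behavior (a sketch mirroring the pattern above, not the generator itself):

```python
from textwrap import dedent

# Same pattern as in generate_modelsmodule(): the string block is indented
# to match the surrounding code, and dedent() removes that common margin
# so the generated C++ file starts at column zero.
header = dedent(
    """
    #include "models.h"
    // Generated includes
    #include "config.h"
    """
)
```

Because the margin is computed over all non-blank lines, the block can be nested at any indentation level inside `generate_modelsmodule()` without affecting the emitted file.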
11 changes: 11 additions & 0 deletions doc/htmldoc/examples/index.rst
@@ -198,6 +198,16 @@ PyNEST examples
* :doc:`../auto_examples/evaluate_tsodyks2_synapse`


.. grid:: 1 1 2 3

.. grid-item-card:: :doc:`../auto_examples/eprop_plasticity/index`
:img-top: ../static/img/pynest/eprop_supervised_classification_infrastructure.png

* :doc:`/auto_examples/eprop_plasticity/eprop_supervised_classification_evidence-accumulation`
* :doc:`/auto_examples/eprop_plasticity/eprop_supervised_regression_sine-waves`
* :doc:`/auto_examples/eprop_plasticity/eprop_supervised_regression_handwriting`
* :doc:`/auto_examples/eprop_plasticity/eprop_supervised_regression_infinite-loop`


.. grid:: 1 1 2 3

@@ -332,6 +342,7 @@ PyNEST examples
../auto_examples/astrocytes/astrocyte_interaction
../auto_examples/astrocytes/astrocyte_small_network
../auto_examples/astrocytes/astrocyte_brunel
../auto_examples/eprop_plasticity/index

.. toctree::
:hidden:
22 changes: 22 additions & 0 deletions doc/htmldoc/whats_new/v3.7/index.rst
@@ -40,3 +40,25 @@ See examples using astrocyte models:
See connectivity documentation:

* :ref:`tripartite_connectivity`


E-prop plasticity in NEST
-------------------------

Another new NEST feature is eligibility propagation (e-prop) [1]_, a local and
online learning algorithm for recurrent spiking neural networks (RSNNs) that
serves as a biologically plausible approximation to backpropagation through time
(BPTT). It relies on eligibility traces and neuron-specific learning signals to
compute gradients without propagating errors backward in time. This approach
is consistent with learning mechanisms observed in the brain and makes e-prop
a strong candidate for efficiently training RSNNs on low-power neuromorphic
hardware.

For further information, see:

* :doc:`/auto_examples/eprop_plasticity/index`
* :doc:`/models/index_e-prop plasticity`

.. [1] Bellec G, Scherr F, Subramoney A, Hajek E, Salaj D, Legenstein R,
Maass W (2020). A solution to the learning dilemma for recurrent
networks of spiking neurons. Nature Communications, 11:3625.
https://doi.org/10.1038/s41467-020-17236-y
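The gradient computation described above can be sketched for a single synapse: presynaptic spikes are low-pass filtered into an eligibility trace, which is combined with the postsynaptic learning signal at each time step, so no backward pass over time is needed. This is a toy illustration under simplifying assumptions (a single decay constant `kappa`, a plain gradient-descent step, names like `psi_post` chosen for this sketch), not NEST's implementation:

```python
import numpy as np

def eprop_update(z_pre, psi_post, L_post, w, lr=0.01, kappa=0.97):
    """Toy e-prop weight update for one synapse over T time steps.

    z_pre:    presynaptic spikes, shape (T,), entries 0/1
    psi_post: postsynaptic surrogate derivatives, shape (T,)
    L_post:   postsynaptic learning signals, shape (T,)
    """
    z_bar = 0.0   # low-pass filtered presynaptic activity
    e_bar = 0.0   # low-pass filtered eligibility trace
    grad = 0.0
    for t in range(len(z_pre)):
        z_bar = kappa * z_bar + z_pre[t]   # filter presynaptic spikes
        e = psi_post[t] * z_bar            # instantaneous eligibility
        e_bar = kappa * e_bar + e          # eligibility trace
        grad += L_post[t] * e_bar          # local, online gradient term
    return w - lr * grad
```

Everything in the loop is local to the synapse and its postsynaptic neuron and runs forward in time, which is what distinguishes e-prop from BPTT.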
43 changes: 35 additions & 8 deletions libnestutil/block_vector.h
@@ -236,6 +236,14 @@ class BlockVector
*/
void push_back( const value_type_& value );

/**
* @brief Move data to the end of the BlockVector.
* @param value Data to be moved to end of BlockVector.
*
* Moves given data to the element at the end of the BlockVector.
*/
void push_back( value_type_&& value );

/**
* Erases all the elements.
*/
@@ -313,15 +321,17 @@ class BlockVector
/////////////////////////////////////////////////////////////

template < typename value_type_ >
inline BlockVector< value_type_ >::BlockVector()
: blockmap_( std::vector< std::vector< value_type_ > >( 1, std::vector< value_type_ >( max_block_size ) ) )
BlockVector< value_type_ >::BlockVector()
: blockmap_(
std::vector< std::vector< value_type_ > >( 1, std::move( std::vector< value_type_ >( max_block_size ) ) ) )
, finish_( begin() )
{
}

template < typename value_type_ >
inline BlockVector< value_type_ >::BlockVector( size_t n )
: blockmap_( std::vector< std::vector< value_type_ > >( 1, std::vector< value_type_ >( max_block_size ) ) )
BlockVector< value_type_ >::BlockVector( size_t n )
: blockmap_(
std::vector< std::vector< value_type_ > >( 1, std::move( std::vector< value_type_ >( max_block_size ) ) ) )
, finish_( begin() )
{
size_t num_blocks_needed = std::ceil( static_cast< double >( n ) / max_block_size );
@@ -394,7 +404,7 @@ BlockVector< value_type_ >::end() const
}

template < typename value_type_ >
inline void
void
BlockVector< value_type_ >::push_back( const value_type_& value )
{
// If this is the last element in the current block, add another block
@@ -411,7 +421,24 @@ BlockVector< value_type_ >::push_back( const value_type_& value )
}

template < typename value_type_ >
inline void
void
BlockVector< value_type_ >::push_back( value_type_&& value )
{
// If this is the last element in the current block, add another block
if ( finish_.block_it_ == finish_.current_block_end_ - 1 )
{
// Need to get the current position here, then recreate the iterator after we extend the blockmap,
// because after the blockmap is changed the iterator becomes invalid.
const auto current_block = finish_.block_vector_it_ - finish_.block_vector_->blockmap_.begin();
blockmap_.emplace_back( max_block_size );
finish_.block_vector_it_ = finish_.block_vector_->blockmap_.begin() + current_block;
}
*finish_ = std::move( value );
++finish_;
}

template < typename value_type_ >
void
BlockVector< value_type_ >::clear()
{
for ( auto it = blockmap_.begin(); it != blockmap_.end(); ++it )
@@ -442,7 +469,7 @@ BlockVector< value_type_ >::size() const
}

template < typename value_type_ >
inline typename BlockVector< value_type_ >::iterator
typename BlockVector< value_type_ >::iterator
BlockVector< value_type_ >::erase( const_iterator first, const_iterator last )
{
assert( first.block_vector_ == this );
@@ -495,7 +522,7 @@ BlockVector< value_type_ >::erase( const_iterator first, const_iterator last )
}

template < typename value_type_ >
inline void
void
BlockVector< value_type_ >::print_blocks() const
{
std::cerr << "this: \t\t" << this << "\n";
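The growth logic shared by both `push_back` overloads — append a fresh block whenever the write lands on the last slot of the final block, so the end position stays valid — can be modeled in a few lines of Python. This is a simplified sketch that tracks an explicit element count instead of the `finish_` iterator; `MAX_BLOCK_SIZE` is illustrative, not NEST's value:

```python
MAX_BLOCK_SIZE = 4  # illustrative; the real container uses a larger block size

class ToyBlockVector:
    """Minimal model of BlockVector: a list of fixed-size blocks."""

    def __init__(self):
        self.blocks = [[None] * MAX_BLOCK_SIZE]
        self.size = 0  # number of stored elements

    def push_back(self, value):
        block, offset = divmod(self.size, MAX_BLOCK_SIZE)
        # Writing the last slot of the final block: grow the blockmap first,
        # mirroring the check on finish_ in the C++ push_back overloads.
        if offset == MAX_BLOCK_SIZE - 1 and block == len(self.blocks) - 1:
            self.blocks.append([None] * MAX_BLOCK_SIZE)
        self.blocks[block][offset] = value
        self.size += 1

    def __getitem__(self, i):
        block, offset = divmod(i, MAX_BLOCK_SIZE)
        return self.blocks[block][offset]
```

The C++ version needs the extra save/restore of `block_vector_it_` around `emplace_back` because growing `blockmap_` can reallocate the outer vector and invalidate the iterator — a concern the index-based sketch sidesteps.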
1 change: 1 addition & 0 deletions models/CMakeLists.txt
@@ -26,6 +26,7 @@ set(models_sources
rate_neuron_ipn.h rate_neuron_ipn_impl.h
rate_neuron_opn.h rate_neuron_opn_impl.h
rate_transformer_node.h rate_transformer_node_impl.h
weight_optimizer.h weight_optimizer.cpp
${MODELS_SOURCES_GENERATED}
)

