tweaks to get pdf to build
thesamovar committed Jun 25, 2024
1 parent 55f6979 commit d0feaf1
Showing 7 changed files with 161 additions and 6 deletions.
5 changes: 4 additions & 1 deletion .gitignore
Original file line number Diff line number Diff line change
Expand Up @@ -134,4 +134,7 @@ research/tempdata

# Vs code
.vscode
.DS_Store
.DS_Store

# typst
typst.exe
3 changes: 2 additions & 1 deletion myst.yml
Original file line number Diff line number Diff line change
Expand Up @@ -8,6 +8,8 @@ project:
- COMOB, the project for collaborative modelling of the brain
github: https://github.com/comob-project/snn-sound-localization
# bibliography: []
math:
'\argmax': '\operatorname{argmax}'
exclude:
- ReadMe.md
- paper/sections/**
Expand All @@ -18,7 +20,6 @@ project:
SNN: spiking neural network
COMOB: collaborative modelling of the brain
LIF: leaky integrate-and-fire
F&F: filter-and-fire
DCLS: dilated convolutions with learnable spacings
DDL: differentiable delay layer
MSO: medial superior olive
Expand Down
150 changes: 150 additions & 0 deletions paper/paper.bib
Original file line number Diff line number Diff line change
Expand Up @@ -344,4 +344,154 @@ @inproceedings{
booktitle={The Twelfth International Conference on Learning Representations},
year={2024},
url={https://openreview.net/forum?id=4r2ybzJnmN}
}
@article{DalesPrinciple,
title = {Dale’s principle},
journal = {Brain Research Bulletin},
volume = {50},
number = {5},
pages = {349-350},
year = {1999},
issn = {0361-9230},
doi = {https://doi.org/10.1016/S0361-9230(99)00100-8},
url = {https://www.sciencedirect.com/science/article/pii/S0361923099001008},
author = {Piergiorgio Strata and Robin Harvey}
}
@article{LAJ1948,
author = {L. A. Jeffress},
title = {A place theory of sound localization},
journal = {Journal of Comparative and Physiological Psychology},
volume = {41},
pages = {35--39},
year = {1948},
doi = {https://doi.org/10.1037/h0061495}
}
@article{KLWH2001,
author = {Richard Kempter and Christian Leibold and Hermann Wagner and J. Leo van Hemmen},
title = {Formation of temporal-feature maps by axonal propagation of synaptic learning},
journal = {Proc. Natl. Acad. Sci. USA},
volume = {98},
pages = {4166--4171},
year = {2001},
doi = {https://doi.org/10.1073/pnas.061369698}
}
@article{EMI2006,
author = {Eugene M. Izhikevich},
title = {Polychronization: computation with spikes},
journal = {Neural Comput.},
volume = {18},
pages = {245--282},
year = {2006},
doi = {https://doi.org/10.1162/089976606775093882}
}
@article{JSZK2015,
author = {Max Jaderberg and Karen Simonyan and Andrew Zisserman and Koray Kavukcuoglu},
title = {Spatial Transformer Networks},
journal = {arXiv preprint arXiv:1506.02025},
year = {2015},
doi = {https://doi.org/10.48550/arXiv.1506.02025}
}
@article{KBTG2013,
author = {Robert R. Kerr and Anthony N. Burkitt and Doreen A. Thomas and Matthieu Gilson and David B. Grayden},
title = {Delay Selection by Spike-Timing-Dependent Plasticity in Recurrent Networks of Spiking Neurons Receiving Oscillatory Inputs},
journal = {PLoS Comput. Biol.},
volume = {9},
pages = {e1002897},
year = {2013},
doi = {https://doi.org/10.1371/journal.pcbi.1002897}
}
@article{HKTI2016,
author = {Hideyuki Kato and Tohru Ikeguchi},
title = {Oscillation, Conduction Delays, and Learning Cooperate to Establish Neural Competition in Recurrent Networks},
journal = {PLoS ONE},
volume = {11},
pages = {e0146044},
year = {2016},
doi = {https://doi.org/10.1371/journal.pone.0146044}
}
@article{MAVT2017,
author = {Mojtaba Madadi Asl and Alireza Valizadeh and Peter A. Tass},
title = {Dendritic and Axonal Propagation Delays Determine Emergent Structures of Neuronal Networks with Plastic Synapses},
journal = {Sci. Rep.},
volume = {7},
pages = {39682},
year = {2017},
doi = {https://doi.org/10.1038/srep39682}
}
@article{BSEI2010,
author = {Botond Szatmáry and Eugene M. Izhikevich},
title = {Spike-Timing Theory of Working Memory},
journal = {PLoS Comput. Biol.},
volume = {6},
pages = {e1000879},
year = {2010},
doi = {https://doi.org/10.1371/journal.pcbi.1000879}
}
@article{EIAS2018,
author = {Akihiro Eguchi and James B. Isbister and Nasir Ahmad and Simon Stringer},
title = {The Emergence of Polychronization and Feature Binding in a Spiking Neural Network Model of the Primate Ventral Visual System},
journal = {Psychological Review},
volume = {125},
pages = {545--571},
year = {2018},
doi = {https://doi.org/10.1037/rev0000103}
}
@article{TM2017,
author = {Takashi Matsubara},
title = {Conduction Delay Learning Model for Unsupervised and Supervised Classification of Spatio-Temporal Spike Patterns},
journal = {Front. Comput. Neurosci.},
volume = {11},
pages = {104},
year = {2017},
doi = {https://doi.org/10.3389/fncom.2017.00104}
}
@article{HHM2023,
author = {Ilyass Hammouamri and Ismail Khalfaoui-Hassani and Timothée Masquelier},
title = {Learning Delays in Spiking Neural Networks using Dilated Convolutions with Learnable Spacings},
journal = {arXiv preprint arXiv:2306.17670},
year = {2023},
doi = {https://doi.org/10.48550/arXiv.2306.17670}
}
@article{ITT2023,
author = {Ismail Khalfaoui-Hassani and Thomas Pellegrini and Timothée Masquelier},
title = {Dilated convolution with learnable spacings},
journal = {arXiv preprint arXiv:2112.03740},
year = {2023},
doi = {https://doi.org/10.48550/arXiv.2112.03740}
}
1 change: 1 addition & 0 deletions paper/sections/basicmodel/basicmodel.md
Original file line number Diff line number Diff line change
Expand Up @@ -121,6 +121,7 @@ $$W_{ho}\approx a(1-(\delta/\sigma_\delta)^2) e^{-\delta^2/2\sigma_\delta^2}+b$$

where $\delta=o-N_c h / N_h$, $h$ and $o$ are the indices of the hidden and output neurons, $N_h$ is the number of hidden neurons, $N_c$ the number of output neurons, and $a$, $b$ and $\sigma_\delta$ are parameters to estimate. Using this approximation and the rate-based approximation from before, we get the orange curves in {ref}`tuning-curves-output`. If we use both the Ricker wavelet approximation of $W_{ho}$ and the idealised tuning curves, we get the green curves. All in all, this gives us a 6-parameter model that fits the data extremely well, a significant reduction from the 896 parameters of the full model ($N_\psi N_h+N_h N_c$).
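The Ricker wavelet ("Mexican hat") weight profile is straightforward to evaluate directly. Below is a minimal sketch; the function name, the network sizes, and the values of $a$, $b$ and $\sigma_\delta$ are illustrative stand-ins, not the fitted parameters from the paper.

```python
import numpy as np

def ricker_weights(o, h, N_h=8, N_c=12, a=1.0, b=0.0, sigma_delta=2.0):
    """Ricker-wavelet approximation of the hidden-to-output weight W_ho.

    delta measures how far output neuron o sits from the preferred
    position of hidden neuron h (hidden indices rescaled to output range).
    Parameter values here are illustrative, not the fitted ones.
    """
    delta = o - N_c * h / N_h
    return a * (1 - (delta / sigma_delta) ** 2) * np.exp(-delta**2 / (2 * sigma_delta**2)) + b

# Peak response when the output index matches the rescaled hidden index,
# with the characteristic negative side lobes on either side.
center = ricker_weights(0, 0)      # delta = 0: maximum, equals a + b
side_lobe = ricker_weights(3, 0)   # |delta| > sigma_delta: negative
```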

(basic-discussion)=
### Discussion

This subproject was an extension of the original notebook [](../research/3-Starting-Notebook.ipynb) with the aim of understanding the solutions found in more detail. We successfully found a 6-parameter reduced model that behaves extremely similarly to the full model, and we can therefore say that we have largely understood the nature of this solution. We did not look in detail for a deep mathematical reason why this is the solution that is found, and this would make for an interesting follow-up. Are these tuning curves and weights Bayes optimal to reduce the effect of the Poisson spiking noise, for example?
Expand Down
2 changes: 1 addition & 1 deletion paper/sections/contributor_table.md
Original file line number Diff line number Diff line change
Expand Up @@ -52,7 +52,7 @@ If you add a contribution, please use one of the following templates (see exampl
* - ???
- [\@a-dtk](https://github.com/a-dtk)
- (TODO)
* - Sara Evers [sara.evers@curie.fr]
* - Sara Evers
- [\@saraevers](https://github.com/saraevers)
- Conducted research ([](../research/IE-neuron-distribution.ipynb)).
* - Ido Aizenbud
Expand Down
4 changes: 2 additions & 2 deletions paper/sections/notebook_map.md
Original file line number Diff line number Diff line change
Expand Up @@ -71,10 +71,10 @@ flowchart LR
: Results of imposing an excitatory only constraint on the neurons. Appears to find solutions that are more like what would be expected from the Jeffress model. (Author: TODO who is luis-rr???.)

[](../research/Learning_delays.ipynb), [](../research/Learning_delays_major_edit2.ipynb) and [](../research/Solving_problem_with_delay_learning.ipynb)
: Delay learning using differentiable delay layer, written up in [](#learning-delays) (Author: Karim Habashy.)
: Delay learning using differentiable delay layer, written up in [](#delay-section) (Author: Karim Habashy.)

[](../research/Quick_Start_Delay_DCLS.ipynb)
: Delay learning using Dilated Convolution with Learnable Spacings, written up in [](#learning-delays). (Author: Balázs Mészáros.)
: Delay learning using Dilated Convolution with Learnable Spacings, written up in [](#delay-section). (Author: Balázs Mészáros.)

[](../research/Noise_robustness.ipynb)
: Test effects of adding Gaussian noise and/or dropout during training phase. Conclusion is that dropout does not help and adding noise decreases performance. (Author: TODO: Who is a-dtk???.)
Expand Down
2 changes: 1 addition & 1 deletion paper/sections/science.md
Original file line number Diff line number Diff line change
Expand Up @@ -24,7 +24,7 @@ Building on this base model, we explored two main questions: how changing the ne

### Dale's principle

In biological networks most neurons release the same set of transmitters from all of their synapses, and so can broadly be considered to be excitatory or inhibitory to their post-synaptic partners; a phenomenon known as Dale's principle [@10.1177/003591573502800330;@10.1001/jama.1954.02940400080039;@10.1016/S0361-9230(99)00100-8]. In contrast, most neural network models, including our base model, allow single units to have both positive and negative output weights.
In biological networks most neurons release the same set of transmitters from all of their synapses, and so can broadly be considered to be excitatory or inhibitory to their post-synaptic partners; a phenomenon known as Dale's principle [@10.1177/003591573502800330;@10.1001/jama.1954.02940400080039;@DalesPrinciple]. In contrast, most neural network models, including our base model, allow single units to have both positive and negative output weights.

To test the impact of restricting units to being either excitatory or inhibitory, we trained our base model across a range of inhibitory:excitatory unit ratios, and tested its performance on unseen, test data ([](../research/Dales_law.ipynb)). We found that networks which balanced excitation and inhibition performed significantly better than both inhibition-only networks, which perform at chance level as no spikes propagate forward, and excitation-only networks, which were roughly 30% less accurate than balanced networks.
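One common way to impose such a sign constraint is to project the weights onto the allowed signs, row per presynaptic unit. This is a minimal sketch of the idea only; the function name and matrix sizes are made up, and the actual notebook may enforce the constraint differently (e.g. via a fixed sign mask on unconstrained parameters).

```python
import numpy as np

def apply_dale_constraint(W, n_excitatory):
    """Project a weight matrix onto a Dale's-principle constraint.

    Rows are presynaptic units: the first n_excitatory units keep only
    non-negative outgoing weights (excitatory), the remainder only
    non-positive ones (inhibitory). Illustrative sketch, not the
    implementation used in the project's notebook.
    """
    W = W.copy()
    W[:n_excitatory] = np.clip(W[:n_excitatory], 0.0, None)  # excitatory rows
    W[n_excitatory:] = np.clip(W[n_excitatory:], None, 0.0)  # inhibitory rows
    return W

rng = np.random.default_rng(0)
W = rng.standard_normal((8, 4))          # 8 presynaptic, 4 postsynaptic units
W_dale = apply_dale_constraint(W, n_excitatory=6)  # 6:2 excitatory:inhibitory
```

Applying this projection after every optimiser step is one way to sweep the inhibitory:excitatory ratio described above.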

Expand Down
