
Implement e-prop plasticity #2867

Merged: 416 commits, Feb 28, 2024 (diff shows changes from 250 commits)

Commits
7ef1b3d
Use auto instead of explicit HistEntryEprop types
akorgor Nov 7, 2023
cfcb9c7
Initialize n_spikes in the constructor
akorgor Nov 7, 2023
ffda252
Improve naming of iterators
akorgor Nov 7, 2023
1e95a4e
Unify retrieval of eprop, update and firing rate reg history
akorgor Nov 7, 2023
6ccefa0
Remove unneeded empty vector guards
akorgor Nov 7, 2023
1c2a080
Change order of history vectors for consistency
akorgor Nov 7, 2023
4a8d8e8
Hike clang-format version to 17 and activate InsertBraces
heplesser Nov 4, 2023
63ff6b4
Prescribe exact clang-format version
heplesser Nov 4, 2023
30c590c
Insert braces around single-line blocks.
heplesser Nov 7, 2023
87a8062
Add const where needed and remove unneeded
akorgor Nov 7, 2023
b2a10af
Re-add necessary checks for empty containers
heplesser Nov 7, 2023
ba34ef6
Merge branch 'eprop_feature' of github.com:jstapmanns/nest-simulator …
heplesser Nov 8, 2023
b1acfa1
Fix formatting
heplesser Nov 8, 2023
599094b
Merge pull request #35 from heplesser/fix-guards
akorgor Nov 8, 2023
2912044
Improve optimize functions
akorgor Nov 9, 2023
be6583d
Fix loops and types in learning signal handling
akorgor Nov 9, 2023
b00220e
Remove unneeded brackets
akorgor Nov 9, 2023
6012a9c
Introduce option to average gradient
akorgor Nov 9, 2023
55913ae
Make get_shift virtual and implement in neurons
akorgor Nov 9, 2023
abf5eb3
Avoid type casting in firing rate reg
akorgor Nov 9, 2023
5bbf400
Remove blank line
akorgor Nov 9, 2023
3d4f00e
Lift restriction resolution = 1 ms
akorgor Nov 9, 2023
f153344
Change type of z and add description
akorgor Nov 9, 2023
e620b44
Make the tests full-scale
akorgor Nov 9, 2023
a356252
Write to history only if eprop neuron receives eprop synapses
heplesser Nov 9, 2023
81eb0ff
Make examples more robust and informative.
heplesser Nov 10, 2023
3adb1c1
Merge branch 'verification_problem' into add-synapse-registration
heplesser Nov 10, 2023
a0312fe
Fix flake8 problem
heplesser Nov 10, 2023
c0fa755
Merge branch 'verification_problem' into add-synapse-registration
heplesser Nov 10, 2023
ba3ca44
Fix learning-signal readout for empty history
heplesser Nov 10, 2023
a2d3fa7
Make examples more robust and informative.
heplesser Nov 10, 2023
af22125
Fix flake8 problem
heplesser Nov 10, 2023
6a7283c
Fix comment on removing spikes in 0th time step
akorgor Nov 10, 2023
b06a52e
Compare in tests with rtol instead of atol
akorgor Nov 10, 2023
2432a91
Add links to NEST and TF reference implementation
akorgor Nov 10, 2023
8b14794
Fix condition in write_update_history
akorgor Nov 10, 2023
fb10cce
Let linter ignore too long line
akorgor Nov 10, 2023
c8a1e4b
Let linter ignore too long line (second try)
akorgor Nov 10, 2023
3304c6b
Add comment on why decreasing the access counter
akorgor Nov 10, 2023
2be2640
Let linter ignore too long line (third try)
akorgor Nov 10, 2023
bd86a5a
Resolve discrepancy between implementation and name of write_learning…
akorgor Nov 10, 2023
276cd36
Add doxygen comments to eprop archiving node
akorgor Nov 10, 2023
05cf6f9
Protect verification from case len(loss) < len(loss_verification)
JesusEV Nov 12, 2023
a7ff7fd
Add missing BadProperty checks
JesusEV Nov 12, 2023
c56f9a8
Improve accuracy of decreasing counter comment
JesusEV Nov 12, 2023
20f9674
Fix formatting
JesusEV Nov 12, 2023
d9620e3
Improve readability of gradient_change functions
JesusEV Nov 12, 2023
b75d6c3
Refactor erase_unneeded_eprop_history() function via lambda function
JesusEV Nov 12, 2023
b90fbdd
Add comprehensive comments to erase_unneeded* functions
JesusEV Nov 12, 2023
5d48b98
Add comments to init_update_history() function
JesusEV Nov 12, 2023
d88c95a
Add comments to write_learning_signal_to_history() function
JesusEV Nov 12, 2023
83e4907
Add comments to functions
JesusEV Nov 12, 2023
b664c12
Add comments to functions
JesusEV Nov 12, 2023
8efeb8c
Add comments to functions count_spike() and reset_spike_count()
JesusEV Nov 12, 2023
c563a8d
Improve comment
JesusEV Nov 12, 2023
b28cdec
Improve comment
JesusEV Nov 12, 2023
f6a1bb5
Improve comment
JesusEV Nov 12, 2023
1969e37
Fix typo
JesusEV Nov 12, 2023
98e880b
Update comments in eprop_archiving_node
JesusEV Nov 13, 2023
fcccfb8
Update comments in gradient_change() functions
JesusEV Nov 13, 2023
4d0bc1e
Fix formatting
JesusEV Nov 13, 2023
ebc6f5c
Fix wrong indentation error in userdocs compilation
akorgor Nov 13, 2023
b5ab4b4
Fix dt-dependence of firing rate regularization
akorgor Nov 13, 2023
4df1a62
Make unit-conversion comment more precise
akorgor Nov 13, 2023
25728ed
Fix pylint line-too-long error
akorgor Nov 13, 2023
9c33165
Refactor erase_unneeded_eprop_history() function
JesusEV Nov 13, 2023
dfada25
Improve iterator security during histories cleanup
JesusEV Nov 13, 2023
66d801a
Replace propagator_idx with descriptive and restrictive str variable
JesusEV Nov 12, 2023
0bdb9e0
Improve variable name
JesusEV Nov 13, 2023
9fd8669
Merge pull request #39 from jstapmanns/remove-propagator-idx
JesusEV Nov 14, 2023
65305b8
Fix malformed tables
akorgor Nov 14, 2023
6755da6
Create skeleton for doxygen comments
akorgor Nov 14, 2023
cced367
Merge branch 'eprop_feature' into add-synapse-registration
JesusEV Nov 14, 2023
2c8a38a
Merge pull request #37 from heplesser/add-synapse-registration
JesusEV Nov 14, 2023
93111af
Fix formatting
JesusEV Nov 14, 2023
ddbd373
Make piecewise_linear funcs more readable and marginally more efficient
JesusEV Nov 15, 2023
c02d1de
Fix bug in tutorial regression script
JesusEV Nov 15, 2023
0ec3135
Fix BadProperty checks
akorgor Nov 15, 2023
c67d015
Remove restriction on eta
akorgor Nov 15, 2023
fccd6a1
Initialize z in the constructor
akorgor Nov 16, 2023
6ae9d6e
Remove remaining "leak constant complement"
akorgor Nov 16, 2023
f9fce8e
Increase readability wrt identity propagator
akorgor Nov 16, 2023
3a88a31
Unify namespace nest scopes
akorgor Nov 18, 2023
555087e
Disentangle propagators by introducing alpha
akorgor Nov 16, 2023
642fb56
Replace leak_propagator_complement with alpha_complement
akorgor Nov 16, 2023
f74ed59
Fix format
akorgor Nov 16, 2023
4a500ec
Merge branch 'eprop_feature' into wip-eprop-doxygen
akorgor Nov 18, 2023
bc06190
State doxygen descriptions more precisely
akorgor Nov 18, 2023
4c6d6c5
Replace identity with unity
JesusEV Nov 20, 2023
63b5762
Remove remaining eprop_regression
akorgor Nov 20, 2023
a32b904
Add missing identity-unity renaming
akorgor Nov 20, 2023
199f550
Merge branch 'eprop_feature' into wip-eprop-doxygen
akorgor Nov 20, 2023
eff2a7b
Add start to multimeters of recurrent neurons
akorgor Nov 22, 2023
5894c07
Include initial weight in weight plotting
akorgor Nov 22, 2023
81b732f
Emphasize difference to iaf_psc_delta
akorgor Nov 22, 2023
b1110b9
Remove unnecessary n_compare comments
akorgor Nov 22, 2023
2f97d19
Declare variables in gradient_change earlier
akorgor Nov 22, 2023
5fe3a37
Rename requires_buffer to avoid ambiguity
akorgor Nov 22, 2023
a1b4435
Merge branch 'eprop_feature' into wip-eprop-doxygen
akorgor Nov 22, 2023
202d54a
Merge pull request #41 from akorgor/wip-eprop-doxygen
akorgor Nov 22, 2023
b611aec
Remove loss from cross_entropy_loss
akorgor Nov 23, 2023
47ed061
Add/remove blank lines and typo
akorgor Nov 23, 2023
1d1c1a5
First version of eprop optimizer as classes; compiles but fails under…
heplesser Nov 24, 2023
9f7b17c
Fixing bugs
heplesser Nov 24, 2023
5a1fb93
Rename propagators and associated variables
akorgor Nov 23, 2023
ae6ff4e
Rename v_ to v_m_
akorgor Nov 23, 2023
4a05e07
Rename a_ to adapt_
akorgor Nov 23, 2023
bbecfba
Match order of z, z_in to header file
akorgor Nov 23, 2023
2daee47
Set multimeter stop to exclude forced spike
akorgor Nov 27, 2023
8f3990c
Fix generation of input spikes for regression for n_batch > 1
JesusEV Nov 28, 2023
716279b
Fix learning signal getter
akorgor Nov 28, 2023
1747974
Add missing initialization of v_th_adapt
akorgor Nov 28, 2023
2d88981
Add check eprop_learning_window_new > eprop_update_interval_
JesusEV Nov 28, 2023
51a34fd
Fix typo
JesusEV Nov 28, 2023
ca31153
Simplify tutorials and test
akorgor Nov 30, 2023
7824976
Rename tutorials
akorgor Nov 30, 2023
5d86319
Prepare adding more tutorials
akorgor Dec 1, 2023
3e570ff
Remove flake8 in front of noqa
akorgor Dec 2, 2023
260a0b1
Add handwriting and infinite loop task
akorgor Dec 4, 2023
c9a46d1
Add erase_unneeded_* functions to eprop_readout
JesusEV Dec 4, 2023
a798b96
Split optimizers on common and synapse-specific part; does not work f…
heplesser Dec 5, 2023
04e10ff
Fix eprop synapse copy and assignment
heplesser Dec 5, 2023
f22f544
Add assertion
heplesser Dec 5, 2023
4783b65
Update eprop examples
heplesser Dec 5, 2023
ace2098
Add minimal modelset for eprop examples
heplesser Dec 5, 2023
21dbc97
Fix bugs in optimizer class implementation
heplesser Dec 5, 2023
3b1faee
Merge branch 'master' of github.com:nest/nest-simulator into jstap-eprop
heplesser Dec 5, 2023
b634673
Merge branch 'jstap-eprop' into optimizer-class
heplesser Dec 5, 2023
183043a
Adapt eprop tests to new parameter settings approach
heplesser Dec 5, 2023
22fe35b
Adapt eprop examples to new parameter settings approach
heplesser Dec 5, 2023
b2bea9d
Fix pylint
heplesser Dec 5, 2023
1dbe3e0
Place optimizer properties in subdict; work in progress
heplesser Dec 5, 2023
f67066c
Merge pull request #46 from heplesser/jstap-eprop
akorgor Dec 6, 2023
20c7140
Add move constructor and assignment to blockvector and connector
heplesser Dec 7, 2023
bef1975
Set optimizer for all instances of eprop_synapse and add move semantics
heplesser Dec 7, 2023
6f3f092
Remove accidentally added debug output
heplesser Dec 7, 2023
d659037
Fix NEST random seed for examples
akorgor Dec 7, 2023
3d06a2e
Merge branch 'optimizer-class-nested-dict' into optimizer-class
heplesser Dec 7, 2023
eb7b96c
Introduce derived EpropArchivingNode classes
akorgor Dec 7, 2023
7e151f6
Replace get_name with is_eprop_recurrent_node
akorgor Dec 7, 2023
c2d65cc
Fix copyright header
akorgor Dec 7, 2023
3288ff1
Replace get_name with is_eprop_recurrent_node
JesusEV Dec 7, 2023
152493a
Trigger eprop update only after t_next_update + shift
JesusEV Dec 7, 2023
456269f
Add additional checks against user errors
heplesser Dec 7, 2023
131ce25
Merge branch 'jstap-eprop' into optimizer-class
heplesser Dec 7, 2023
9518b70
Merge branch 'master' of github.com:nest/nest-simulator into optimize…
heplesser Dec 7, 2023
d15fbca
Fix merge error
heplesser Dec 7, 2023
8fc48db
Add missing return
heplesser Dec 7, 2023
feb759e
Move management of eprop optimizer object to Connector class with spe…
heplesser Dec 8, 2023
6ec2b53
Merge branch 'optimizer-bugfix' into optimizer-class
heplesser Dec 8, 2023
2050e0d
Fix formatting in documentation
akorgor Dec 9, 2023
2e8fdec
Correct index formatting in docs
akorgor Dec 9, 2023
3f6267c
Fix generation of modelsmodule to include necessary headers
heplesser Dec 10, 2023
77cd5c7
Allow setting of nested optimizer parameters
heplesser Dec 10, 2023
834247c
Convert eprop code, test and examples to nested dict with optimizer t…
heplesser Dec 10, 2023
b0167cc
Adapt test to modified eprop_synapse optimizer settings
heplesser Dec 10, 2023
754ff51
Rename surrogate_gradient to avoid name clash
akorgor Dec 11, 2023
af15301
Add more variables to state::get
akorgor Dec 11, 2023
efab736
Link to eprop_plasticity/index
akorgor Dec 12, 2023
3e27574
Apply suggestions from code review
heplesser Dec 13, 2023
9c5283a
Name changes based on review
heplesser Dec 13, 2023
c289055
Merge branch 'optimizer-class' of github.com:heplesser/nest-simulator…
heplesser Dec 13, 2023
7799a93
Place Wmin before Wmax in examples
heplesser Dec 13, 2023
62f5bc0
Now single modelset for eprop
heplesser Dec 13, 2023
8e49110
Renamed EpropOptimizer* to WeightOptimizer*
heplesser Dec 13, 2023
3c475c6
Making optimizer_ member of eprop_synapse private; remove debugging o…
heplesser Dec 13, 2023
c1f9b39
Convert last remaining EpropOptimizer... to WeightOptimizer...
heplesser Dec 13, 2023
abc5cc3
Update modelsets/eprop
heplesser Dec 13, 2023
23ed521
Merge branch 'eprop_feature' of github.com:jstapmanns/nest-simulator …
heplesser Dec 14, 2023
2caa905
Fix E_L adjustment of adaptive threshold
heplesser Dec 14, 2023
82ec2d1
Merge pull request #47 from heplesser/optimizer-class
akorgor Dec 14, 2023
de3b6b8
Merge branch 'master' of github.com:nest/nest-simulator into jstap-eprop
heplesser Dec 14, 2023
2cc2758
Merge pull request #48 from heplesser/jstap-eprop
akorgor Dec 14, 2023
f2cd3f1
Fix compilation in absence of eprop_synapse
heplesser Dec 14, 2023
aab1f57
Much cleaner implementation of Connector template specialisations for…
heplesser Dec 14, 2023
c2547d8
Merge pull request #49 from heplesser/jstap-eprop
akorgor Dec 14, 2023
b0e377b
Rename adapting_threshold to V_th_adapt
akorgor Dec 14, 2023
29f599c
Make BadProperty messages consistent
akorgor Dec 14, 2023
a786e9b
Refactor synapse dictionaries in tutorials
akorgor Dec 14, 2023
580f72f
Instead of resolution=1ms, set delay=resolution in tests
akorgor Dec 14, 2023
4fa0bb3
Add _to_history to get_learning_signal to avoid name clash
akorgor Dec 14, 2023
63410bd
Initialize dynamic variables of Adam optimizer
akorgor Dec 15, 2023
6210cdf
Match access specifiers between base and derived node classes
akorgor Dec 15, 2023
b85dc1a
Add technical comments
heplesser Dec 15, 2023
0d1f255
Merge pull request #50 from heplesser/jstap-eprop
akorgor Dec 15, 2023
9cc78b9
Reorganize user documentation and update parameters
akorgor Dec 15, 2023
1fb6a90
Add more information to e-prop functions in node
akorgor Dec 15, 2023
bc77ddb
List criteria to erase in erase functions
akorgor Dec 15, 2023
02c1448
Add _from to name get_learning_signal_history
akorgor Dec 16, 2023
a2f1403
Rename beta1, beta2 to beta_1, beta_2
akorgor Dec 16, 2023
b1b37d7
Simplify comment
akorgor Dec 16, 2023
8988bac
Add more doxygen class headers
akorgor Dec 16, 2023
2a9ca90
Add more doxygen comments
akorgor Dec 16, 2023
078b8c8
Update user documentation
akorgor Dec 16, 2023
9e37e0b
Make developer comments consistent
akorgor Dec 16, 2023
7b7366d
Adjust learning rate validation in weight_optimizer
JesusEV Dec 19, 2023
e22f06f
Adjust lower bound of weight range in WeightOptimizer
JesusEV Dec 19, 2023
f03e896
Rename gradient_change to compute_gradient across eprop models
JesusEV Dec 19, 2023
5e4ecf7
Update eprop example images
JesusEV Dec 20, 2023
a4b2d81
Renamed eprop models to include _bsshslm_2020; compiles, examples and…
heplesser Dec 20, 2023
1b8b092
Tests adapted to renamed eprop models
heplesser Dec 20, 2023
118fd64
Adjusted examples to renamed eprop models
heplesser Dec 20, 2023
220c976
Renamed test file
heplesser Dec 20, 2023
b38be21
Renamed eprop_iaf_bsshslm_2020_adapt to eprop_iaf_adapt_bsshslm_2020 …
heplesser Dec 20, 2023
b563ddd
Remove verification from tutorials
akorgor Dec 20, 2023
8f33438
Add info in what's new
akorgor Dec 20, 2023
ed574ae
Merge branch 'eprop_feature' of github.com:jstapmanns/nest-simulator …
heplesser Dec 20, 2023
ab948fa
Apply suggestions from code review
heplesser Dec 20, 2023
601f87a
Merge branch 'jstap-eprop' of github.com:heplesser/nest-simulator int…
heplesser Dec 20, 2023
48c6f8c
Fix modelsets
heplesser Dec 20, 2023
ec41554
Merge branch 'master' of github.com:nest/nest-simulator into jstap-eprop
heplesser Dec 21, 2023
3253f47
Merge pull request #51 from heplesser/jstap-eprop
akorgor Dec 21, 2023
5a9d0d2
Add explanation of _bsshslm_2020 to user documentation
akorgor Dec 21, 2023
e608e42
Turn plotting on in tutorials
akorgor Dec 21, 2023
ae675f9
Fix rendering of math equations
akorgor Dec 21, 2023
af143a2
Adjust note on similarity to iaf_psc_delta
akorgor Dec 21, 2023
5120f72
Fix type of learning window
akorgor Jan 12, 2024
c18aeca
Fix developer comment on error signal
akorgor Jan 12, 2024
6d7c05e
Add missing arg average_gradient in compute_gradient declaration
akorgor Jan 12, 2024
8a180d6
Fix type of t in HistEntryEpropReadout
akorgor Jan 12, 2024
a53c915
Rename erase_unneeded* to erase_used*
akorgor Jan 17, 2024
540d23b
Fix typo
akorgor Jan 17, 2024
6691e5c
Turn KernelException into an IllegalConnection
akorgor Jan 17, 2024
a3fdbde
Rename alpha_t to alpha
akorgor Jan 18, 2024
1b58a27
Order variables alphabetically
akorgor Jan 19, 2024
beda0c1
Add protection for n_record, n_record == 0
akorgor Jan 19, 2024
0cac4f0
Remove else which is not needed because of return
akorgor Jan 25, 2024
01d0a24
Merge branch 'master' into eprop_feature
akorgor Feb 14, 2024
046e247
Make leak / resting membrane potential naming consistent
akorgor Feb 14, 2024
bea0837
Update models/eprop_synapse_bsshslm_2020.h
akorgor Feb 15, 2024
b875e0c
Update nestkernel/histentry.h
akorgor Feb 15, 2024
f394e3d
Update models/eprop_synapse_bsshslm_2020.h
akorgor Feb 15, 2024
61f17ee
Update nestkernel/histentry.h
akorgor Feb 15, 2024
67769c0
Update nestkernel/histentry.h
akorgor Feb 15, 2024
6fbcf94
Update nestkernel/eprop_archiving_node.h
akorgor Feb 15, 2024
3a00b34
Update nestkernel/eprop_archiving_node.h
akorgor Feb 15, 2024
f384e86
Update models/weight_optimizer.h
akorgor Feb 15, 2024
b0c7f6d
Update models/weight_optimizer.h
akorgor Feb 15, 2024
1afb404
Update models/eprop_iaf_adapt_bsshslm_2020.cpp
akorgor Feb 15, 2024
c612b07
Complete regular_spike_arrival framework
akorgor Feb 16, 2024
12981cc
Fix docs on regular_spike_arrival
akorgor Feb 16, 2024
dee5908
Fix typo
akorgor Feb 23, 2024
56465c1
Fix spelling
akorgor Feb 26, 2024
13797e6
Add n_record note to all tutorials
akorgor Feb 26, 2024
0112c5a
Add start and stop to spike and weight recorder
akorgor Feb 26, 2024
178ea14
Remove old n_record setting
akorgor Feb 26, 2024
5ad82f5
Avoid unnecessary elements in loop to accelerate plotting
akorgor Feb 27, 2024
0ea9212
Turn f- into r-strings to fix flake8 error
akorgor Feb 27, 2024
cdc27d7
Add missing information to the user documentation
akorgor Feb 28, 2024
3fff77a
Merge branch 'master' into eprop_feature
akorgor Feb 28, 2024
Files changed
17 changes: 13 additions & 4 deletions build_support/generate_modelsmodule.py
@@ -27,6 +27,7 @@
"""

import argparse
import itertools
import os
import sys
from pathlib import Path
@@ -109,6 +110,7 @@ def get_models_from_file(model_file):
"public Node": "node",
"public ClopathArchivingNode": "clopath",
"public UrbanczikArchivingNode": "urbanczik",
"public EpropArchivingNode": "neuron",
"typedef binary_neuron": "binary",
"typedef rate_": "rate",
}
@@ -227,9 +229,7 @@ def generate_modelsmodule():
1. the copyright header.
2. a list of generic NEST includes
3. the list of includes for the models to build into NEST
-4. some boilerplate function implementations needed to fulfill the
-   Module interface
-5. the list of model registration lines for the models to build
4. the list of model registration lines for the models to build
into NEST

The code is enriched by structured C++ comments as to make
@@ -246,7 +246,16 @@
modeldir.mkdir(parents=True, exist_ok=True)
with open(modeldir / fname, "w") as file:
file.write(copyright_header.replace("{{file_name}}", fname))
-file.write('\n#include "models.h"\n\n// Generated includes\n#include "config.h"\n')
file.write(
dedent(
"""
#include "models.h"

// Generated includes
#include "config.h"
"""
)
)

for model_type, guards_fnames in includes.items():
file.write(f"\n// {model_type.capitalize()} models\n")
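For context on the change above: the dedent-based rewrite produces the same preamble string as the old single-line `file.write` call, just laid out more readably. A minimal sketch of that equivalence, using only the Python standard library:

```python
from textwrap import dedent

# Preamble built the way the new generator code builds it.
new_preamble = dedent(
    """
    #include "models.h"

    // Generated includes
    #include "config.h"
    """
)

# Preamble as the old one-line write produced it.
old_preamble = '\n#include "models.h"\n\n// Generated includes\n#include "config.h"\n'

# dedent() strips the common indentation and normalizes the trailing
# whitespace-only line, so the two strings are identical.
assert new_preamble == old_preamble
```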
11 changes: 11 additions & 0 deletions doc/htmldoc/examples/index.rst
@@ -198,6 +198,16 @@ PyNEST examples
* :doc:`../auto_examples/evaluate_tsodyks2_synapse`


.. grid:: 1 1 2 3

.. grid-item-card:: :doc:`../auto_examples/eprop_plasticity/index`
:img-top: ../static/img/pynest/eprop_supervised_classification_infrastructure.png

* :doc:`/auto_examples/eprop_plasticity/eprop_supervised_classification_evidence-accumulation`
* :doc:`/auto_examples/eprop_plasticity/eprop_supervised_regression_sine-waves`
* :doc:`/auto_examples/eprop_plasticity/eprop_supervised_regression_handwriting`
* :doc:`/auto_examples/eprop_plasticity/eprop_supervised_regression_infinite-loop`


.. grid:: 1 1 2 3

@@ -332,6 +342,7 @@ PyNEST examples
../auto_examples/astrocytes/astrocyte_interaction
../auto_examples/astrocytes/astrocyte_small_network
../auto_examples/astrocytes/astrocyte_brunel
../auto_examples/eprop_plasticity/index

.. toctree::
:hidden:
22 changes: 22 additions & 0 deletions doc/htmldoc/whats_new/v3.7/index.rst
@@ -40,3 +40,25 @@ See examples using astrocyte models:
See connectivity documentation:

* :ref:`tripartite_connectivity`


E-prop plasticity in NEST
-------------------------

Another new NEST feature is eligibility propagation (e-prop) [1]_, a local and
online learning algorithm for recurrent spiking neural networks (RSNNs) that
serves as a biologically plausible approximation to backpropagation through time
(BPTT). It relies on eligibility traces and neuron-specific learning signals to
compute gradients without the need for error propagation backward in time. This
approach aligns with the brain's learning mechanisms and offers a strong
candidate for efficient training of RSNNs in low-power neuromorphic hardware.

For further information, see:

* :doc:`/auto_examples/eprop_plasticity/index`
* :doc:`/models/index_e-prop plasticity`

.. [1] Bellec G, Scherr F, Subramoney F, Hajek E, Salaj D, Legenstein R,
Maass W (2020). A solution to the learning dilemma for recurrent
networks of spiking neurons. Nature Communications, 11:3625.
https://doi.org/10.1038/s41467-020-17236-y
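As a concrete companion to the description above, here is a minimal PyNEST sketch of how an e-prop network might be wired. The model names (`eprop_iaf_bsshslm_2020`, `eprop_readout_bsshslm_2020`, `eprop_synapse_bsshslm_2020`, `eprop_learning_signal_connection_bsshslm_2020`) are taken from this PR's commit history; the network size, parameters, and overall structure are illustrative assumptions, not a copy of the shipped tutorials.

```python
import nest

nest.ResetKernel()

# Recurrent e-prop neurons and a readout neuron (model names from this PR;
# sizes and parameters are illustrative assumptions).
rec = nest.Create("eprop_iaf_bsshslm_2020", 100)
out = nest.Create("eprop_readout_bsshslm_2020", 1)
inp = nest.Create("poisson_generator", params={"rate": 100.0})

# Plastic e-prop synapses between recurrent neurons and towards the readout;
# delay = resolution follows the convention used in the PR's tests.
eprop_syn = {"synapse_model": "eprop_synapse_bsshslm_2020", "delay": nest.resolution}

nest.Connect(inp, rec, syn_spec={"synapse_model": "static_synapse"})
nest.Connect(rec, rec, syn_spec=eprop_syn)
nest.Connect(rec, out, syn_spec=eprop_syn)

# Learning signals travel from the readout back to the recurrent neurons.
nest.Connect(
    out, rec, syn_spec={"synapse_model": "eprop_learning_signal_connection_bsshslm_2020"}
)

nest.Simulate(1000.0)
```

How the per-synapse weight optimizer is parameterized is sketched after the `models/CMakeLists.txt` change below.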
43 changes: 35 additions & 8 deletions libnestutil/block_vector.h
@@ -236,6 +236,14 @@ class BlockVector
*/
void push_back( const value_type_& value );

/**
* @brief Move data to the end of the BlockVector.
* @param value Data to be moved to end of BlockVector.
*
* Moves given data to the element at the end of the BlockVector.
*/
void push_back( value_type_&& value );

/**
* Erases all the elements.
*/
@@ -313,15 +321,17 @@
/////////////////////////////////////////////////////////////

template < typename value_type_ >
-inline BlockVector< value_type_ >::BlockVector()
-: blockmap_( std::vector< std::vector< value_type_ > >( 1, std::vector< value_type_ >( max_block_size ) ) )
BlockVector< value_type_ >::BlockVector()
: blockmap_(
std::vector< std::vector< value_type_ > >( 1, std::move( std::vector< value_type_ >( max_block_size ) ) ) )
, finish_( begin() )
{
}

template < typename value_type_ >
-inline BlockVector< value_type_ >::BlockVector( size_t n )
-: blockmap_( std::vector< std::vector< value_type_ > >( 1, std::vector< value_type_ >( max_block_size ) ) )
BlockVector< value_type_ >::BlockVector( size_t n )
: blockmap_(
std::vector< std::vector< value_type_ > >( 1, std::move( std::vector< value_type_ >( max_block_size ) ) ) )
, finish_( begin() )
{
size_t num_blocks_needed = std::ceil( static_cast< double >( n ) / max_block_size );
@@ -394,7 +404,7 @@ BlockVector< value_type_ >::end() const
}

template < typename value_type_ >
-inline void
void
BlockVector< value_type_ >::push_back( const value_type_& value )
{
// If this is the last element in the current block, add another block
@@ -411,7 +421,24 @@ BlockVector< value_type_ >::push_back( const value_type_& value )
}

template < typename value_type_ >
-inline void
void
BlockVector< value_type_ >::push_back( value_type_&& value )
{
// If this is the last element in the current block, add another block
if ( finish_.block_it_ == finish_.current_block_end_ - 1 )
{
// Need to get the current position here, then recreate the iterator after we extend the blockmap,
// because after the blockmap is changed the iterator becomes invalid.
const auto current_block = finish_.block_vector_it_ - finish_.block_vector_->blockmap_.begin();
blockmap_.emplace_back( max_block_size );
finish_.block_vector_it_ = finish_.block_vector_->blockmap_.begin() + current_block;
}
*finish_ = std::move( value );
++finish_;
}

template < typename value_type_ >
void
BlockVector< value_type_ >::clear()
{
for ( auto it = blockmap_.begin(); it != blockmap_.end(); ++it )
@@ -442,7 +469,7 @@ BlockVector< value_type_ >::size() const
}

template < typename value_type_ >
-inline typename BlockVector< value_type_ >::iterator
typename BlockVector< value_type_ >::iterator
BlockVector< value_type_ >::erase( const_iterator first, const_iterator last )
{
assert( first.block_vector_ == this );
@@ -495,7 +522,7 @@ BlockVector< value_type_ >::erase( const_iterator first, const_iterator last )
}

template < typename value_type_ >
-inline void
void
BlockVector< value_type_ >::print_blocks() const
{
std::cerr << "this: \t\t" << this << "\n";
1 change: 1 addition & 0 deletions models/CMakeLists.txt
@@ -26,6 +26,7 @@ set(models_sources
rate_neuron_ipn.h rate_neuron_ipn_impl.h
rate_neuron_opn.h rate_neuron_opn_impl.h
rate_transformer_node.h rate_transformer_node_impl.h
weight_optimizer.h weight_optimizer.cpp
${MODELS_SOURCES_GENERATED}
)

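The `weight_optimizer` sources registered above provide the `WeightOptimizer` classes referenced throughout the commit history (plain gradient descent and Adam). Below is a hedged sketch of how the nested optimizer dictionary might be set on the e-prop synapse defaults; the key names echo commit titles (`eta`, `beta_1`, `beta_2`, `Wmin`, `Wmax`), but the exact keys, values, and layout are assumptions rather than text taken from the diff.

```python
import nest

# Per-synapse weight optimizer configured through the nested "optimizer" dict
# (cf. "Place optimizer properties in subdict" in the commit list).
# All values below are illustrative only.
nest.SetDefaults(
    "eprop_synapse_bsshslm_2020",
    {
        "optimizer": {
            "type": "adam",   # alternative assumed here: "gradient_descent"
            "eta": 1e-3,      # learning rate
            "beta_1": 0.9,    # decay of the first moment estimate
            "beta_2": 0.999,  # decay of the second moment estimate
            "Wmin": -100.0,   # lower bound of the weight range
            "Wmax": 100.0,    # upper bound of the weight range
        }
    },
)
```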