Add symmetric stdp synapse to NEST #218
The test Python file in manualtests generates this graph (figure not included). Things I'm not sure about:
@@ -0,0 +1,300 @@
/*
 * stdp_connection.h
In the copyright notice, the correct file name should be stated: stdp_symmetric_connection.h. This is why regressiontests/ticket-659-copyright.py fails.
Also the formatting of some of the file seems to be off, e.g. see the TravisCI output. Please make sure that you use
Reference: Vogels et al. (2011) Inhibitory Plasticity Balances Excitation and Inhibition in Sensory Pathways and Memory Networks. Science Vol. 334, Issue 6062, pp. 1569-1573. DOI: 10.1126/science.1211095. http://www.sciencemag.org/content/334/6062/1569.abstract
Also adds a Python test file to the manualtests folder.
The version in Fedora is 3.7.x, which was causing the tests to fail.
Force-pushed from b33fc84 to 92b55b6.
@tammoippen - Thank you for the notes on the test failures. The new set of commits seems to be OK.
@flinz @suku248 would be good second reviewers. @sanjayankur31: a non-manual test would be a great addition.
Thanks @jakobj - I'll look into writing a non-manual test.
facilitate_( double_t w, double_t kplus )
{
  double_t norm_w = ( w / Wmax_ ) + ( lambda_ * eta_ * kplus );
  return norm_w < 1.0 ? norm_w * Wmax_ : Wmax_;
I believe there is a small error, caused by the normalization with Wmax_. Let me demonstrate here, but the same holds for depression.
For given values of w and kplus, you are assigning the new w to be (assuming the norm_w < 1.0 branch is taken, i.e. the weight is not clamped to Wmax_):
w_returned = [( w_old / Wmax_ ) + ( lambda_ * eta_ * kplus )] * Wmax_
           = w_old + lambda_ * eta_ * kplus * Wmax_
This makes the weight updates scale additionally with Wmax_, which I believe should not be the case. For facilitation this should rather be:
w_returned = w_old + lambda_ * eta_ * kplus
You can change the lines to something akin to this implementation.
Just as a remark: looking at the corresponding line in stdp_connection for the additive case (mu_plus = 0) I think we get the same scaling with Wmax_. @abigailm @mhelias am I getting something wrong, or is this intended behavior?
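To make the scaling concrete, here is a small plain-Python sketch of the two updates being discussed (function names and parameter values are illustrative, not the synapse defaults):

```python
# Sketch of the two facilitation updates discussed above.
# Wmax, lam (lambda_), eta and kplus are illustrative values,
# not the synapse's default parameters.

def facilitate_implemented(w, kplus, Wmax=100.0, lam=0.01, eta=0.001):
    """Update as in the snippet above: normalise by Wmax, add, rescale by Wmax."""
    norm_w = (w / Wmax) + (lam * eta * kplus)
    return norm_w * Wmax if norm_w < 1.0 else Wmax

def facilitate_suggested(w, kplus, Wmax=100.0, lam=0.01, eta=0.001):
    """Update without the extra Wmax factor in the increment."""
    return min(w + lam * eta * kplus, Wmax)

w, kplus = 1.0, 2.0
dw_impl = facilitate_implemented(w, kplus) - w  # lam*eta*kplus*Wmax = 0.002
dw_sugg = facilitate_suggested(w, kplus) - w    # lam*eta*kplus      = 0.00002
print(dw_impl, dw_sugg)
```

With these values the implemented update is exactly Wmax times larger than the suggested one, which is the scaling the comment points out.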
I wasn't sure about this either, but decided to go with the way stdp_connection is implemented (as you observe above).
Hey @sanjayankur31, I had a quick look at your code and the paper. Before you go on digging into the details below, here is a general point to consider: the Vogels learning rule is indeed not implementable by the current
Besides, here are comments on the current class:
Hi @flinz - thank you very much for your comments. I'm travelling over the next week or so and will address them as soon as I get back.
One question before I make any more changes - is the suggested practice in NEST to simply add more commits on top of the current ones, thus maintaining the complete review and code history, or should I squash commits to make it all tidy (and then maybe force push)?
@sanjayankur31: we think it's nicer to just add commits to your branch in order to have a nice and transparent history of the code review process. @flinz: many thanks for your nice and solid review and the quick reaction. @heplesser: could you make sure the copyright transfer is handled properly? Thanks!
I've fixed some of the minor issues and pushed commits. I'll fix the major issues as soon as I can.
As requested, I've replaced kappa with alpha, and added a comment in the documentation explaining the relation between the two.
# set up the synapse
syn_spec_synapse = {'weight': weight_pre, 'Wmax': 100.,
                    'kappa': 0.1, 'eta': 0.001,
When I run this test, I get Unused dictionary items: kappa.
Looking into the default parameters reveals that kappa is indeed not a parameter.
kappa was replaced by alpha after the review. The manual test file hasn't been updated since it'll be replaced by an automated test (I'm working on the latter).
TODO - add source files to
@sanjayankur31 Would you pull the newest changes from master so that Travis can check this PR properly now that #391 is merged?
This ensures that the synapse can be both excitatory and inhibitory.
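For reference, the symmetric inhibitory learning rule from Vogels et al. (2011) can be sketched in a few lines of plain Python: every pre/post pairing potentiates via decaying spike traces, while each presynaptic spike additionally depresses by a constant alpha, so the weight change can have either sign. Parameter values here are illustrative, not the synapse defaults:

```python
import math

# Minimal sketch of the symmetric STDP rule of Vogels et al. (2011).
# On a presynaptic spike:  dw = eta * (x_post - alpha)
# On a postsynaptic spike: dw = eta * x_pre
# x_pre / x_post are spike traces decaying with time constant tau.
# All parameter values below are illustrative.

def vogels_sprekeler_weight(pre_spikes, post_spikes,
                            eta=0.01, alpha=0.12, tau=20.0, w0=0.5):
    x_pre = x_post = 0.0   # exponentially decaying spike traces
    w = w0
    t_last = 0.0
    # process all spikes in time order, tagging their origin
    events = sorted([(t, 'pre') for t in pre_spikes] +
                    [(t, 'post') for t in post_spikes])
    for t, kind in events:
        decay = math.exp(-(t - t_last) / tau)
        x_pre *= decay
        x_post *= decay
        t_last = t
        if kind == 'pre':
            w += eta * (x_post - alpha)  # potentiation minus constant depression
            x_pre += 1.0
        else:
            w += eta * x_pre             # symmetric potentiation on post spikes
            x_post += 1.0
    return w

# An isolated presynaptic spike only depresses (by eta * alpha):
print(vogels_sprekeler_weight([10.0], []))
```

Because depression is a constant offset rather than a timing-dependent window, nearby pre/post pairs potentiate regardless of their order, which is what makes the rule symmetric.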
Hiya, I've updated the source to use an implementation similar to the triplet synapse in #284. So this is finally ready for a complete review and merge :)
@@ -112,6 +112,7 @@
#include "stdp_triplet_connection.h"
#include "stdp_dopa_connection.h"
#include "stdp_pl_connection_hom.h"
#include "vogels_sprekeler_connection.h"
Could you put this in alphabetical order?
@sanjayankur31 The code looks clean except for two little details, see inline comments. @flinz @suku248 Would you have a final look before we merge?
Seems fine to me 👍
@sanjayankur31 And merging :).
@heplesser sorry I was on holidays. Too late for the party 👍 |
Add three types of nearest-neighbour STDP: symmetric (but not to be confused with PR nest#218), presynaptic-centered, and restricted symmetric. These are worth implementing because they are described in a highly-cited review [Morrison A., Diesmann M., and Gerstner W. (2008) Phenomenological models of synaptic plasticity based on spike timing, Biol. Cybern. 98, 459--478]. In these models a spike is taken into account in the weight change rule not with all preceding spikes, but only with some of the nearest; see the three pairing schemes in Fig. 7 of [Morrison et al., 2008]. The implementation relies on two additional variables - presynaptic and postsynaptic traces, as in stdp_synapse. However, their update rules differ from those of the latter. That is why the archiving_node was modified to add one more state variable, nearest_neighbor_Kminus. It is not stored in the node, just computed on the fly when requested. To return nearest_neighbor_Kminus I changed the archiving_node.get_K_values() function signature because, as far as I can see, this function is not currently used anywhere. The tests for all three models reside in a single file (but as three separate tests) and work as follows: generate two Poisson spike sequences as presynaptic and postsynaptic, feed them to NEST to get the synaptic weight change, then reproduce the weight change independently outside of NEST and check whether the expected weight equals the obtained one.
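The difference between all-to-all STDP (as in stdp_synapse) and the nearest-neighbour schemes described above comes down to how a spike trace is updated at spike times: incrementing the trace pairs a spike with all preceding partners, while resetting it to 1 pairs it only with the most recent one. A minimal sketch (the function name, tau, and spike times are illustrative, following the trace formulation of Morrison et al., 2008):

```python
import math

# Sketch of the trace-update difference described above.
# All-to-all STDP: a spike *increments* the trace (every past spike pairs).
# Nearest-neighbour STDP: a spike *resets* the trace to 1 (only the most
# recent spike pairs). tau and the spike times are illustrative.

def trace(spike_times, t_query, tau=20.0, nearest_neighbour=False):
    K = 0.0
    t_last = None
    for t in spike_times:
        if t >= t_query:
            break
        if t_last is not None:
            K *= math.exp(-(t - t_last) / tau)  # decay since previous spike
        if nearest_neighbour:
            K = 1.0   # only the nearest spike is paired
        else:
            K += 1.0  # all preceding spikes are paired
        t_last = t
    if t_last is None:
        return 0.0
    return K * math.exp(-(t_query - t_last) / tau)

spikes = [0.0, 5.0, 10.0]
print(trace(spikes, 12.0))                          # all-to-all
print(trace(spikes, 12.0, nearest_neighbour=True))  # nearest-neighbour
```

The nearest-neighbour trace depends only on the last spike before the query time, which matches the idea of computing nearest_neighbor_Kminus on the fly rather than storing a separate history.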