Add option to generate connections from NumPy arrays #1429
Conversation
Hi @hakonsbm!
@alberto-antonietti Yes, it is possible to use non-unique elements.
Great! That is what we would like to have with #1423 and HBP Ticket #482997.
Made weight and delay optional, added optional receptor_type specification.
@alberto-antonietti: the reference to this PR in your comment is most likely not what you wanted. Can you please check and update? Thanks!
Very, very nice! I have plenty of comments, but they are rather minor. Many thanks!
Thanks a lot for your work! I agree with Jochen's comments. Also added a few minor suggestions!
@@ -194,6 +194,14 @@ class NodeCollection
   */
  static NodeCollectionPTR create( const TokenArray& node_ids );

  /**
   * Create a NodeCollection from a single node ID. Results in a primitive.
"Results in a primitive": Is this sentence complete?
A primitive is a NodeCollection that is contiguous and homogeneous. It is used as a noun throughout the file, so if the wording is confusing, it probably has to be changed everywhere.
Ah, I see. In that case it is best to leave `primitive` as it is.
Hi @jougs, I updated the reference, sorry! @hakonsbm I do not know if I can add something meaningful to the two reviews of @jougs and @sarakonradi, but I will try to test it as soon as I have a bit of time (given the upcoming end of SGA2).
Many thanks for addressing all my comments in such a timely fashion. Good to go from my side.
Thanks @hakonsbm for the great work!
I have added a few comments from the "user" perspective, to provide better information about the exact errors that could be triggered.
Do you think that the limitation of setting only `synapse_model`, `weights`, `delays`, and `receptors` could be relaxed?
I do not know how hard that would be, but I can think of some of my simulations where I connect non-unique neurons (therefore I cannot use the standard `nest.Connect()` between `NodeCollections`) and where I set other `syn_spec` entries, especially if I have synaptic plasticity.
doc/guides/from_nest2_to_nest3.rst (Outdated)
nest.Create('iaf_psc_alpha', 10)
# Node IDs in the arrays must address existing nodes, but may occur multiple times.
sources = np.array([1, 5, 7, 5], dtype=np.uint64)
targets = np.array([2, 2, 4, 4], dtype=np.uint64)
nest.Connect(sources, targets)
If I run this code snippet, I get an error:
----> 5 nest.Connect(sources, targets)
~/workspace/nest-simulator/b/lib/python3.6/site-packages/nest/ll_api.py in stack_checker_func(*args, **kwargs)
245 def stack_checker_func(*args, **kwargs):
246 if not get_debug():
--> 247 return f(*args, **kwargs)
248 else:
249 sr('count')
~/workspace/nest-simulator/b/lib/python3.6/site-packages/nest/lib/hl_api_connections.py in Connect(pre, post, conn_spec, syn_spec, return_synapsecollection)
224 if return_synapsecollection:
225 raise ValueError("SynapseCollection cannot be returned when connecting two arrays of node IDs")
--> 226 weights = numpy.array(processed_syn_spec['weight']) if 'weight' in processed_syn_spec else None
227 delays = numpy.array(processed_syn_spec['delay']) if 'delay' in processed_syn_spec else None
228 receptor_type = (numpy.array(processed_syn_spec['receptor_type'])
TypeError: argument of type 'NoneType' is not iterable
I do not get the error if I specify the synapse model (`nest.Connect(sources, targets, syn_spec={"synapse_model": "static_synapse"})`).
Would it be useful to force the specification of `synapse_model`?
Yes, we are forcing the specification of `synapse_model`. I just forgot to update the documentation, but it's fixed now.
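A minimal sketch of the now-required call form, based on the documentation snippet above (assumes the ten `iaf_psc_alpha` neurons from that snippet have been created):

import nest
import numpy as np

nest.Create('iaf_psc_alpha', 10)  # node IDs 1-10

sources = np.array([1, 5, 7, 5], dtype=np.uint64)
targets = np.array([2, 2, 4, 4], dtype=np.uint64)

# synapse_model must be given explicitly when connecting two arrays of node IDs
nest.Connect(sources, targets, syn_spec={'synapse_model': 'static_synapse'})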
if len(set(processed_syn_spec.keys()) - set(['weight', 'delay', 'synapse_model', 'receptor_type'])) != 0:
    raise ValueError("When connecting two arrays of node IDs, the synapse specification dictionary can "
                     "only contain weights, delays, synapse model, and r_port.")
connect_arrays(pre, post, weights, delays, receptor_type, synapse_model)
I see that it's documented; however, if one calls by mistake something like `nest.Connect(pre, post, syn_spec={"synapse_model": "static_synapse", "weight": 2.0})`, the error message they get is not very informative.
~/workspace/nest-simulator/b/lib/python3.6/site-packages/nest/ll_api.py in stack_checker_func(*args, **kwargs)
245 def stack_checker_func(*args, **kwargs):
246 if not get_debug():
--> 247 return f(*args, **kwargs)
248 else:
249 sr('count')
~/workspace/nest-simulator/b/lib/python3.6/site-packages/nest/lib/hl_api_connections.py in Connect(pre, post, conn_spec, syn_spec, return_synapsecollection)
237 raise ValueError("When connecting two arrays of node IDs, the synapse specification dictionary can "
238 "only contain weights, delays, synapse model, and r_port.")
--> 239 connect_arrays(pre, post, weights, delays, receptor_type, synapse_model)
240 return
241
pynestkernel.pyx in pynestkernel.NESTEngine.connect_arrays()
TypeError: len() of unsized object
Apparently one can create a 0-dimensional NumPy array with, for example, `np.array(2.0)`, which we didn't check for when connecting. I have added some checks for it now.
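A rough sketch of the kind of guard this refers to (illustrative only; the variable names and error message below are hypothetical, not the actual NEST code):

import numpy as np

weight = np.array(2.0)  # 0-dimensional array created from a scalar
if isinstance(weight, np.ndarray) and weight.ndim == 0:
    # len(weight) would fail with "len() of unsized object", so reject it early
    raise TypeError("'weight' must be a 1-dimensional array with one entry per connection")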
@@ -233,9 +232,3 @@ def fixdict(d):
    projections = fixdict(projections)
If "weight" is a list/np.array with a length different from len(sources) and len(targets), the following error is raised, e.g.:

n = 10
nest.Create("iaf_cond_exp", n)
pre = np.random.randint(1, n + 1, 100)
post = np.copy(pre)
np.random.shuffle(post)
nest.Connect(pre, post, syn_spec={"synapse_model": "static_synapse", "weight": [2.0, 3.0]})
Error:
~/workspace/nest-simulator/b/lib/python3.6/site-packages/nest/lib/hl_api_connections.py in Connect(pre, post, conn_spec, syn_spec, return_synapsecollection)
217 # to the right formats.
218 processed_syn_spec = _process_syn_spec(
--> 219 syn_spec, processed_conn_spec, len(pre), len(post))
220
221 # If pre and post are arrays of node IDs, and conn_spec is unspecified,
~/workspace/nest-simulator/b/lib/python3.6/site-packages/nest/lib/hl_api_connection_helpers.py in _process_syn_spec(syn_spec, conn_spec, prelength, postlength)
74 raise kernel.NESTError(
75 "'" + key + "' has to be an array of "
---> 76 "dimension " + str(prelength) + ", a "
77 "scalar or a dictionary.")
78 else:
NESTError: 'weight' has to be an array of dimension 100, a scalar or a dictionary.
If we are connecting np.arrays, the error should say that 'weight' has to be an array of dimension 100, right?
Nice catch! I've updated the error messages.
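For reference, a hedged sketch of the example above with matching array lengths (one weight and delay per source-target pair), which is the usage the updated message points to:

import nest
import numpy as np

n = 10
nest.Create("iaf_cond_exp", n)  # node IDs 1-10
pre = np.random.randint(1, n + 1, 100).astype(np.uint64)
post = np.copy(pre)
np.random.shuffle(post)
weights = np.full(len(pre), 2.0)  # same length as pre and post
delays = np.ones(len(pre))
nest.Connect(pre, post, syn_spec={"synapse_model": "static_synapse",
                                  "weight": weights, "delay": delays})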
Is there a reason against just converting lists or numpy arrays to NodeCollections if such inputs are provided?
Any NodeID can occur only once in a NodeCollection, while repetitions are permitted in arrays. Support for NumPy arrays is mainly aimed at cases where connections are to be generated from explicitly given connection data, where repeated NodeIDs will be rather common.
Sure; on the other hand, connections to and from devices have a priori no reason to feature duplicate IDs, and an error could simply be raised if the length of the converted NodeCollection differs from that of the initial list...
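A small illustration of the distinction (hedged sketch: `nest.NodeCollection` is assumed to be the NEST 3 list constructor, and the exact error raised for duplicates may differ):

import nest
import numpy as np

nest.Create('iaf_psc_alpha', 10)

# a plain array may repeat node IDs ...
sources = np.array([1, 5, 7, 5], dtype=np.uint64)

# ... but a NodeCollection holds each node ID at most once, so the
# repeated ID cannot survive a conversion to a NodeCollection
try:
    nest.NodeCollection([1, 5, 7, 5])  # expected to be rejected
except Exception as err:
    print(err)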
@Silmathoron I'm not sure this is what you're asking about, but when connecting you can use node IDs of devices as well, e.g.:

nest.Create('iaf_psc_alpha', 10)  # Node IDs 1-10
nest.Create('spike_detector')     # Node ID 11
sources = np.array([1, 5, 7], dtype=np.uint64)
targets = np.array([11, 11, 11], dtype=np.uint64)
weights = np.ones(len(sources))
delays = np.ones(len(sources))
nest.Connect(sources, targets, syn_spec={'weight': weights, 'delay': delays,
                                         'synapse_model': 'static_synapse'})
@hakonsbm @heplesser

import nest
import numpy as np

# create neurons and devices
neuron = nest.Create("iaf_psc_alpha", 10)
vm = nest.Create("voltmeter", 10)

# set weights
weights = np.ones(len(neuron))
delays = np.ones(len(neuron))

# numpy version
arr_vm = np.array(vm.tolist(), dtype=int)
arr_nrn = np.array(neuron.tolist(), dtype=int)

# works
nest.Connect(arr_vm, arr_nrn, syn_spec={'weight': weights, 'delay': delays, 'synapse_model': 'static_synapse'})

# does not work
nest.Connect(arr_vm, neuron.tolist(), syn_spec={'weight': weights, 'delay': delays, 'synapse_model': 'static_synapse'})
nest.Connect(vm.tolist(), neuron.tolist(), syn_spec={'weight': weights, 'delay': delays, 'synapse_model': 'static_synapse'})
nest.Connect(arr_vm, arr_nrn)
nest.Connect(arr_vm, neuron, syn_spec={'weight': weights, 'delay': delays, 'synapse_model': 'static_synapse'})
nest.Connect(arr_vm, neuron)
nest.Connect(vm.tolist(), neuron)

The way I see it, requiring ...
I stressed the new function a bit :-) If I try to connect two NumPy arrays where pre and/or post neurons have a single element, e.g.:
I get
Another similar error in the parameters provided can be the following one:
I added a check for the first case and slightly modified the error messages for the second case.
…from_data small changes for managing 0-dimensional inputs
Thank you for the effort!
I am now reviewing the related PR #1485
This fixes #1423 and HBP Ticket #482997.
Connections are created one-to-one based on NumPy arrays of source and target node IDs, and a synapse specification containing NumPy arrays of weights and delays. All arrays have to be of equal length. For efficiency, the PyNEST kernel bypasses SLI and passes a pointer for each array to the C++ level.
It can be used as in the following example:
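The original example is not reproduced on this page; below is a minimal sketch assembled from the snippets discussed in this thread (ten `iaf_psc_alpha` neurons, one-to-one connections with explicit weight and delay arrays):

import nest
import numpy as np

nest.Create('iaf_psc_alpha', 10)  # node IDs 1-10

# one connection per array entry; repeated node IDs are allowed
sources = np.array([1, 5, 7, 5], dtype=np.uint64)
targets = np.array([2, 2, 4, 4], dtype=np.uint64)
weights = np.ones(len(sources))
delays = np.ones(len(sources))

nest.Connect(sources, targets,
             syn_spec={'weight': weights, 'delay': delays,
                       'synapse_model': 'static_synapse'})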
This replaces the implementation of `Connect_nonunique`. Fixes #1423.