
Ei network tutorial #309

Merged: 32 commits into lava-nc:main on Sep 23, 2022
Conversation

ackurth-nc (Contributor) opened this pull request:

Issue Number: #308

Add a tutorial on a standard network of recurrently connected excitatory and inhibitory neurons.

Pull request checklist

Your PR fulfills the following requirements:

  • Issue created that explains the change and why it's needed
  • Tests are part of the PR (for bug fixes / features)
  • Docs reviewed and added / updated if needed (for bug fixes / features)
  • PR conforms to Coding Conventions
  • PR applies BSD 3-clause or LGPL 2.1+ licenses to all code files
  • Lint (flakeheaven lint src/lava tests/) and (bandit -r src/lava/.) pass locally
  • Build tests (pytest) pass locally

Pull request type

Please check your PR type:

  • Bugfix
  • Feature
  • Code style update (formatting, renaming)
  • Refactoring (no functional changes, no api changes)
  • Build related changes
  • Documentation changes
  • Other (please describe):

Does this introduce a breaking change?

  • Yes
  • No

Supplemental information

I also made a diagram for the created Process. What's the workflow for adding this?

@awintel (Contributor) left a comment:

Looks good. Just a few remarks:

  • "how with Lava you can implement" sounds like a German has written that ;-)
  • Remove "add diagram" comment or add the diagram
  • "From bird's-eye view, an ..." -> "From a birds-eye..."
  • "Additionaly, we here require a separation..." -> "Additionally, we require..."
  • Both your EINetwork Process and SubProcModel define defaults for the Vars but different ones. This seems unnecessary.
  • If I remember correctly, SubProcModels don't need a protocol specification. This gets ignored and inherited from the actually instantiated sub-processes.
  • Maybe say why you use RingBuffers as both source and sink... that might be surprising.
  • You could consider putting the EINetwork and RingBuffer setup in a function so that you don't have to show the same code twice in your different experiments (see the sketch below).
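
A minimal sketch of the setup helper suggested in the last point, assuming the notebook's EINetwork Process exposes inp/out ports and shape_exc/shape_inh constructor arguments (the actual interface may differ):

from lava.proc.io.source import RingBuffer as SourceRingBuffer
from lava.proc.io.sink import RingBuffer as SinkRingBuffer


def setup_ei_experiment(network_params, input_data, num_steps):
    """Wire an EINetwork between a source and a sink RingBuffer."""
    network = EINetwork(**network_params)        # EINetwork as defined in the notebook.
    source = SourceRingBuffer(data=input_data)   # Replays input_data column by column.
    num_neurons = network_params["shape_exc"] + network_params["shape_inh"]
    sink = SinkRingBuffer(shape=(num_neurons,), buffer=num_steps)  # Records the output.
    source.s_out.connect(network.inp)
    network.out.connect(sink.a_in)
    return network, source, sink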

The Loihi version of the tutorial should remain in intel-collab. However, the main tutorial may refer to it.

Thanks

@weidel-p (Contributor) left a comment:

Thanks for this nice tutorial; I think it's ready to be published. I added two minor comments complementing those by AW.

@PhilippPlank added the area: tutorials label on Aug 31, 2022.
@ackurth-nc (Contributor, Author) commented on Sep 19, 2022:

The E/I tutorial has now been improved, including:

  • an E/I network Lava Process with a new, simplified interface
  • multiple ProcessModels for the E/I network, including implementations with rate and LIF neurons
  • a preliminary function mapping floating-point to fixed-point parameters
  • a simulation of the E/I network with bit-accurate LIF and Dense ProcessModels and a comparison with the floating-point one

The last part, treating the bit-accurate model, can be removed if deemed unnecessary or not polished enough.
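
For illustration, a rough sketch of one way such a preliminary float-to-fixed mapping could look; the function name, the 8-bit weight width, and the power-of-two scaling are assumptions, not the tutorial's actual implementation:

import numpy as np


def float_to_fixed_weights(weights: np.ndarray, num_bits: int = 8):
    """Map floating-point weights to signed fixed-point integers plus a scaling exponent."""
    max_int = 2 ** (num_bits - 1) - 1
    max_abs = np.max(np.abs(weights))
    # Choose a power-of-two scale so the largest weight just fits into num_bits.
    exponent = int(np.floor(np.log2(max_int / max_abs)))
    scale = 2.0 ** exponent
    fixed = np.clip(np.round(weights * scale), -max_int - 1, max_int).astype(int)
    # The exponent must be carried along so thresholds etc. can be rescaled consistently.
    return fixed, exponent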

@ackurth-nc requested review from awintel and removed the request for drager-intel on September 19, 2022, 19:51.
@mathisrichter (Contributor) left a comment:

This looks very good and polished. I gave some detailed feedback on portions of the code, though those are rather minor points.

There are, however, some more general points that require attention:

  • None of the code seems to be unit tested. Code that does not have automated unit tests should usually not pass code review. I would expect unit tests at least for the function that converts rate-code parameters to LIF parameters (a sketch of such tests follows this list). Make sure that you think through any qualitatively different cases of input and write tests for those. What happens, for instance, if the user does not supply shape_exc or misspells it? I would usually mark my review as 'Request changes' for now but will approve it given the upcoming release. If you can add tests, that would be great. For future development, please consider adopting test-driven development as a habit.
  • The code is missing type annotations throughout. Please consider getting into the habit of adding type annotations to any code you write.
  • Following the Lava developer guidelines, all comments should be sentences that begin with a capital letter and end in a full stop.
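
A hedged sketch of the kind of tests requested in the first point above, assuming the conversion function in tutorials/end_to_end/convert_params.py can be imported as convert_rate_to_lif_params and takes keyword arguments such as shape_exc (the real signature may differ):

import unittest

from convert_params import convert_rate_to_lif_params  # Assumed module and function name.


class TestRateToLifConversion(unittest.TestCase):
    def test_conversion_returns_parameters(self) -> None:
        # Parameter names and values are illustrative; adapt them to the real signature.
        lif_params = convert_rate_to_lif_params(shape_exc=80, shape_inh=20)
        self.assertIsInstance(lif_params, dict)

    def test_missing_shape_exc_raises(self) -> None:
        # Assuming shape_exc is a required argument, omitting it should fail loudly.
        with self.assertRaises(TypeError):
            convert_rate_to_lif_params(shape_inh=20)

    def test_misspelled_parameter_raises(self) -> None:
        # Assuming unknown keywords are rejected rather than silently ignored.
        with self.assertRaises(TypeError):
            convert_rate_to_lif_params(shape_ecx=80, shape_inh=20)


if __name__ == "__main__":
    unittest.main()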

@tim-shea (Contributor) left a comment:

Comments on tutorial:
The intro text at top is too long. The bold "First aim", "secondly", and "finally" can all be cut. Recommend changing "bird's eye view" to "high level". Recommend simplifying/shortening "E/I Network".

All imports should be collected at the top unless they are conditional or you are going to spend some time explaining something about them.

Recommend factoring generate_gaussian_weights out; since this is a fundamental component of what an E/I net is, it should be properly introduced and described (see the sketch below).
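
For reference, a hedged sketch of what a factored-out generate_gaussian_weights could look like; the argument names and the 1/sqrt(N) scaling are illustrative assumptions rather than the notebook's actual code:

import numpy as np


def generate_gaussian_weights(shape_exc: int, shape_inh: int,
                              g_factor: float = 4.5,
                              q_factor: float = 1.0,
                              seed: int = 0) -> np.ndarray:
    """Random E/I weight matrix with Gaussian-distributed magnitudes."""
    rng = np.random.default_rng(seed)
    num_neurons = shape_exc + shape_inh
    # Draw non-negative connection strengths around a mean of one.
    magnitudes = np.abs(rng.normal(loc=1.0, scale=0.25,
                                   size=(num_neurons, num_neurons)))
    # Scale by q_factor / sqrt(N) so the summed recurrent input stays of order one.
    weights = magnitudes * q_factor / np.sqrt(num_neurons)
    # Excitatory presynaptic neurons (first shape_exc columns) excite;
    # inhibitory ones (remaining columns) inhibit, g_factor times more strongly.
    weights[:, shape_exc:] *= -g_factor
    return weights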

"Defining the parameters..." text is too long.

The text below "Visualizing the network" does not match my impression from the plot. Please try to explain what I am seeing in the plot with an outsider's perspective.

The covariance analysis feels too academic and unmotivated. For both this and the previous plot, maybe show two side-by-side plots so that you can illustrate to the developer what is different or what question could be answered through this analysis.

"chaotical" > "chaotic"

Note that the chaotic network also seems to be drifting at a macro-level. If you run this for longer, you may see more interesting behavior that could be analysed. In addition, both network state plots could be improved by plotting fewer overlapping neuron states. E.g. plot just the first 25 dimensions.
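
To illustrate the last point, a minimal snippet that plots only the first few state dimensions; the array states (shape num_steps x num_neurons) is assumed to hold the recorded activity:

import matplotlib.pyplot as plt

num_dims_to_plot = 25
plt.plot(states[:, :num_dims_to_plot], alpha=0.7, linewidth=0.8)
plt.xlabel("Time step")
plt.ylabel("Neuron state")
plt.title(f"First {num_dims_to_plot} neuron states")
plt.show()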

CustomRunConfigFloat is not explained and the "if isinstance" pattern is an anti-pattern. Can't you accomplish this selection using tags without overriding select?

The text in "Self-balancing of the network" is too academic.

In the small vs high weights plot, one of these labels should be changed (either small vs large, or low vs high).

Typo in "Runnig a process model"

Overall and most importantly, each individual stage of analysis seems technically fine, but there doesn't seem to be any clear, driving purpose to this tutorial. I think it would strongly benefit from thinking about "what is the one main thing we are trying to teach people with this tutorial?" and then structuring the stages to progressively work towards that goal.

@ackurth-nc (Contributor, Author) commented on Sep 22, 2022:

Thanks, Tim, for the thorough review.

I addressed some of the remarks in the commit c53222a.

Comments on tutorial: The intro text at top is too long. The bold "First aim", "secondly", and "finally" can all be cut. Recommend changing "bird's eye view" to "high level". Recommend simplifying/shortening "E/I Network".

Followed your recommendations.

All imports should be collected at the top unless they are conditional or you are going to spend some time explaining something about them.

I of course agree for a single script, but is this also best practice in a tutorial? The other tutorials handle imports the same way as this one.

Recommend factoring generate_gaussian_weights out, since this is a fundamental component of what an E/I net is, it should be properly introduced and described.

Done.

"Defining the parameters..." text is too long.

Adapted text.

The text below "Visualizing the network" does not match my impression from the plot. Please try to explain what I am seeing in the plot with an outsider's perspective.

Adapted text.

The covariance analysis feels too academic and unmotivated. For both this, and the previous plot, maybe show two side-by-side plots such that you can illustrate to the developer what is different or what question could be answered through this analysis.

I adapted the text.

"chaotical" > "chaotic"

Done.

Note that the chaotic network also seems to be drifting at a macro-level. If you run this for longer, you may see more interesting behavior that could be analysed. In addition, both network state plots could be improved by plotting fewer overlapping neuron states. E.g. plot just the first 25 dimensions.

Changed the plotting to plot fewer neurons.

CustomRunConfigFloat is not explained and the "if isinstance" pattern is an anti-pattern. Can't you accomplish this selection using tags without overriding select?

I wrote a sentence for the CustomRunConfig. Regarding your question: I would not know how to do this. When I select a tag, say 'lif_neurons' as in the tutorial, for the ProcessModel of the EINetwork, and the RingBuffer has no implementation for this tag, I think we run into problems. But maybe I am mistaken! Furthermore, I also want to specifically choose my LIF and Dense ProcessModels (for the bit-accurate runs later). The code snippet for the CustomRunConfig comes from here.
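
For context, a hedged sketch of the select-override pattern under discussion, modeled on the approach shown in the Lava ProcessModel tutorial; the concrete bit-accurate model classes used here are illustrative assumptions:

from lava.magma.core.run_configs import Loihi1SimCfg
from lava.proc.lif.process import LIF
from lava.proc.lif.models import PyLifModelBitAcc
from lava.proc.dense.process import Dense
from lava.proc.dense.models import PyDenseModelBitAcc


class CustomRunConfigFixed(Loihi1SimCfg):
    def select(self, proc, proc_models):
        # Pin LIF and Dense to their bit-accurate ProcessModels and fall back to
        # the parent's selection for everything else (e.g. the RingBuffers, which
        # have no matching tag).
        if isinstance(proc, LIF):
            return PyLifModelBitAcc
        if isinstance(proc, Dense):
            return PyDenseModelBitAcc
        return super().select(proc, proc_models)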

The text in "Self-balancing of the network" is too academic.

Adapted the text a bit. If you think it does not fit at all, I can also remove it.

In the small vs high weights plot, one of these labels should be changed (either small vs large, or low vs high).

Done.

Typo in "Runnig a process model"

Done.

Overall and most importantly, each individual stage of analysis seems technically fine, but there doesn't seem to be any clear, driving purpose to this tutorial. I think it would strongly benefit from thinking about "what is the one main thing we are trying to teach people with this tutorial?" and then structuring the stages to progressively work towards that goal.

I see what you mean. The main purpose of the tutorial, to my understanding, is to showcase how to work with different ProcessModels that implement different behavior but still share computational similarities. To see this, I need the analysis. For this reason, there is also a unified interface for the rate and LIF networks, as well as the conversion functionality.
Does this make it a bit clearer?
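
A small illustration of that idea: with a unified EINetwork interface, switching behavior amounts to selecting a different ProcessModel in the RunConfig. The 'lif_neurons' tag is the one mentioned in this thread; the 'rate_neurons' tag is an assumption:

from lava.magma.core.run_configs import Loihi1SimCfg

# Run the same EINetwork with its rate-based ProcessModel ...
run_cfg_rate = Loihi1SimCfg(select_tag="rate_neurons", select_sub_proc_model=True)
# ... or with its LIF-based ProcessModel; the rest of the script stays unchanged.
run_cfg_lif = Loihi1SimCfg(select_tag="lif_neurons", select_sub_proc_model=True)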

@awintel (Contributor) left a comment:

Looks good!

Names are inconsistent:
def __init__(self, proc):
super().__init__(proc_params=proc)
It's either proc or proc_params, but a proc shouldn't turn into proc_params.

Typo:
hot to define and select

Inconsistency:
Some single-line comments end with a full stop, others don't.

Why do we not quantitatively compare the rate, float-LIF and fixed-LIF models?

@ackurth-nc (Contributor, Author) commented on Sep 23, 2022:

Thanks for the review, I addressed your points in f2793b7.

Names are inconsistent: def __init__(self, proc): super().__init__(proc_params=proc) It's either proc or proc_params, but a proc shouldn't turn into proc_params.

Great catch! I made it consistent.

Typo: hot to define and select

Fixed typo.

Inconsistency: Some single line comments end with a full stop, other don't.

Made it consistent.

Why do we not quantitatively compare the rate, float-LIF and fixed-LIF models?

It is not directly straightforward, because the rate in a LIF network is a priori positive, whereas in the rate network the state can also attain negative values; in a way it is more like a temporally averaged rate.
I added a rate comparison where one can see that the behavior actually is very similar when averaged correctly.
If you think it does not fit in this form, we can also take it out again.
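
A rough sketch of the kind of averaging described above, assuming the recorded LIF spikes are available as a binary array of shape (num_steps, num_neurons); the window length and normalization are illustrative:

import numpy as np


def windowed_rate(spikes: np.ndarray, window: int = 50) -> np.ndarray:
    """Moving-average firing rate per neuron, in spikes per time step."""
    kernel = np.ones(window) / window
    # Smooth each neuron's spike train (column) with the averaging window.
    return np.apply_along_axis(
        lambda s: np.convolve(s, kernel, mode="same"), axis=0, arr=spikes)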

@mgkwill merged commit ea56c86 into lava-nc:main on Sep 23, 2022.
monkin77 pushed a commit to monkin77/thesis-lava that referenced this pull request on Jul 12, 2024:
* Add excitatory inhibitory neural network tutorial

* Simplify connections, fix typos

* Added motivation for E/I networks, add figure argument to raster plot

* Add analysis of recurrently generated currents

* Simplify connections

* Test removal of break

* Formatting, change of num_steps

* Formatting

* Add tutorial notebook E/I network

* Add E/I tutorial to tests

* Adapt EI tutorial to higher level tutorial

* Extend E/I network tutorial by rate ProcModel

* Unify interface, add rate to LIF conversion

* Remove unnecessary import and remove commented out code

* Add aliased state variable

* Move conversion math into separate script

* Add experimental float2fixed conversion

* Add float- to fixed-point conversion example

* Update tutorial, remove old tutorial

* Improve plotting comparison float fixed

* Add learn more section

* Fix typos, amend comments, add license

* Address remarks from review

* Simplify LIF E/I Network Proc Model

* Make arguments for conversion function explicit

* Fix typos, make arg and comments consistent, add rate