Enable Sparse processes #672
Conversation
…o dev/sparse_proc (Conflicts: tests/lava/proc/sparse/test_model.py)
…o dev/sparse_proc
…o dev/sparse_proc (Conflicts: tests/lava/proc/sparse/test_model.py)
…o dev/sparse_proc (Conflicts: tests/lava/proc/sparse/test_model.py)
Tests pass, fantastic work! Thank you very much!
This is such a huge PR that any reviewer will have a very hard time comprehending it, let alone judging it. I made an effort to point out minor things that I noticed along the way but got overwhelmed by the amount of code. I have a feeling that we should be testing many more variations of the user-level API: the new sparse Processes have so many parameters that they should really be exercised more in tests.
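For concreteness, here is a hypothetical sketch of the kind of user-level parameter sweep meant above, using pytest.mark.parametrize. The module path lava.proc.sparse.process.Sparse and the shape check on the weights Var are assumptions about the new API, not code taken from this PR's test suite.

```python
# Hypothetical test sketch (not from this PR): sweep a few shapes and densities
# of the assumed Sparse process via stacked pytest parametrization.
import pytest
from scipy.sparse import random as sparse_random

from lava.proc.sparse.process import Sparse  # assumed module path


@pytest.mark.parametrize("shape", [(2, 3), (5, 5), (10, 1)])
@pytest.mark.parametrize("density", [0.0, 0.1, 1.0])
def test_sparse_init_variants(shape, density):
    """Sparse should accept weight matrices of various shapes and densities."""
    weights = sparse_random(*shape, density=density, format="csr")
    conn = Sparse(weights=weights)
    # Assumes the weights Var preserves the matrix shape, as Dense does.
    assert conn.weights.shape == shape
```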
Squashed commit messages:
- added Sparse proc and init test
- added test and implementation for sparse proc model in floating precision
- test for graded spikes
- added bit acc version and adapted weight utils
- typehints
- integer weights for fixed point tests
- draft learning
- draft get
- refactoring tests
- avoid saving var in varmodel
- draft set
- Sparse get/set working on CPU
- delay sparse process + test
- change order of weights
- minor fix for complete sparse matrix
- minor fix
- delay sparse model + test
- get/set for float/fixed
- delay sparse fix for int input
- LearningSparse floating-pt version + test
- all tests for delay dense also run for delay sparse
- update test naming
- use dot product
- make learning float work
- fixed pt learning
- rm bit approx version
- lint
- added learning dense for bit approx
- improve calculation of wgt_dly
- tests for dt and dd
- lint
- avoid warnings
- improve getting the zero matrix
- lint
- minor changes
- improve documentation
- minor change
- changes to comments
- improve documentation
- improve documentation
- minor change

Co-authored-by: gkarray <[email protected]>
Co-authored-by: SveaMeyer13 <[email protected]>
Co-authored-by: PhilippPlank <[email protected]>
Issue Number:
#656 #636
Objective of pull request:
This PR enables connectivity via sparse weight matrices through new Sparse Processes. It also supports synaptic delays and learning, each in floating-point and bit-approximate (fixed-point) versions.
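For orientation, a minimal usage sketch follows. It assumes the new process is exposed as lava.proc.sparse.process.Sparse and mirrors the Dense API while accepting a scipy.sparse weight matrix; the module path, port names, and keyword arguments are inferred from the file paths in this PR rather than confirmed documentation.

```python
# Hedged sketch: Sparse is assumed to mirror Dense (lava.proc.dense.process.Dense)
# but accept a scipy.sparse weight matrix instead of a dense ndarray.
import numpy as np
from scipy.sparse import csr_matrix

from lava.magma.core.run_conditions import RunSteps
from lava.magma.core.run_configs import Loihi2SimCfg
from lava.proc.lif.process import LIF
from lava.proc.sparse.process import Sparse  # assumed module path

# Mostly-zero connectivity stored in CSR format.
w = np.zeros((3, 3))
w[0, 1] = 20
w[2, 0] = 15
weights = csr_matrix(w)

pre = LIF(shape=(3,), du=0, dv=0, bias_mant=10, vth=10)
conn = Sparse(weights=weights)  # sparse drop-in for Dense(weights=...)
post = LIF(shape=(3,), du=0, dv=0, vth=10)

pre.s_out.connect(conn.s_in)
conn.a_out.connect(post.a_in)

post.run(condition=RunSteps(num_steps=10),
         run_cfg=Loihi2SimCfg(select_tag="floating_pt"))
post.stop()
```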
Pull request checklist
Your PR fulfills the following requirements:
- (flakeheaven lint src/lava tests/) and (bandit -r src/lava/.) pass locally
- (pytest) passes locally

Pull request type
Please check your PR type:
What is the current behavior?
What is the new behavior?
Does this introduce a breaking change?
Supplemental information
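As a hedged supplemental illustration, the sketch below shows how the delay and learning variants might be used. The class names DelaySparse and LearningSparse, their keyword arguments, and the STDPLoihi parameters are assumed to mirror the existing DelayDense / LearningDense processes; treat this as illustrative rather than the confirmed API of this PR.

```python
# Hedged sketch only: DelaySparse / LearningSparse and their kwargs are assumed
# to mirror the existing DelayDense / LearningDense processes.
import numpy as np
from scipy.sparse import csr_matrix

from lava.proc.learning_rules.stdp_learning_rule import STDPLoihi
from lava.proc.sparse.process import DelaySparse, LearningSparse  # assumed

weights = csr_matrix(np.eye(4) * 12)

# Per-synapse delays kept in the same sparse layout as the weights.
delays = csr_matrix(np.eye(4, dtype=int) * 2)
delayed_conn = DelaySparse(weights=weights, delays=delays)

# Plastic sparse connectivity driven by an STDP learning rule.
stdp = STDPLoihi(learning_rate=1.0,
                 A_plus=1.0, A_minus=-1.0,
                 tau_plus=10.0, tau_minus=10.0,
                 t_epoch=4)
plastic_conn = LearningSparse(weights=weights, learning_rule=stdp)
```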