Add rf and rf_iz neurons to lava #378
Conversation
I have been running into the following error when I try to run it:

During handling of the above exception, another exception occurred:
Traceback (most recent call last):

Does this command only work on Linux?
It seems not to work on Windows. You can look at the linting issues here: https://github.com/lava-nc/lava/actions/runs/3204188629/jobs/5235186915
Hi @Michaeljurado24, thanks for the PR. I have commented on my thoughts on the code and suggested some changes.
BTW @Michaeljurado24, can you email me? I think it's better if we have a 1:1 meeting.
Sure thing. I could not find your email, so I sent you a 'connect' request on LinkedIn with my email so we can collaborate more closely on this.
I am not exactly sure why the RF neuron no-decay test is failing on Ubuntu right now, since I tried it on an Ubuntu machine and it passed. I suspect it is caused by a slight floating point difference and will look into it.
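The suspected failure mode can be sketched in plain Python (illustrative only, not Lava code). Floating point addition is not associative, so the same neuron state accumulated in a slightly different order — as can happen across machines, BLAS builds, or compiler optimizations — can land on either side of a spike threshold:

```python
# Two mathematically equivalent ways to sum the same inputs.
s1 = (0.1 + 0.2) + 0.3   # 0.6000000000000001
s2 = 0.1 + (0.2 + 0.3)   # 0.6

# A threshold sitting between the two results flips the spike decision.
threshold = 0.6000000000000001

print(s1 >= threshold)   # True  -> "spikes"
print(s2 >= threshold)   # False -> "does not spike"
```

This is why the later commits widen the margin between the test's oscillation peak and the threshold instead of relying on an exact crossing.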
I saw that @bamsumit approved the PR 4 days ago.
An approval just means that the feature and the code quality are okay. All unit tests must still pass before we can merge the PR. I restarted the unit tests.
Looks good. Thank you for your contribution.
Thank you all as well for the help and for answering both my logistical and technical questions.
* Added code for floating point rf and rf_iz neurons. Need to move toward fixed point
* Fixed RF process.
* Added fixed point for rf neurons. Floating point rf behavior looks weird
* Added modified rf activation for floating point
* fixed fixed_pt rf_iz neuron
* moved location of scale threshold function to be in line with lif
* moving scale_threshold back to abstract rf model class since rf_iz uses the same method
* added state_exp. Removed most of the deepcopies. Removed some defaults. Fixed formatting
* removed accidental print statement
* Refactored rf and rf_iz class to look like sdn. Added comments and test code for rf floating point process model
* Cleaned up test_resonator_process.py
* Added fixed point unit tests for rf and rf_iz
* Lava implementation of RF neurons matches lava-dl. RF unit test currently hanging after merge from main
* Removed temporary testing files. Fixed linting errors
* Fixed typo
* Clarified comments
* increase specificity of unit tests
* Added comments and cleaned up test code. Added additional fixed point test for rf neurons
* Rf unit test fails to spike due to floating point error on some machines. Fixed the error
* Increase magnitude of input to rf float no decay unit test to encourage spiking
* Made floating point rf test possibly more robust to floating point errors
* fixed linting errors
* Removed useless abs() op in unit test. Added small enhancement to make unit test more robust
* Slight change to test float no decay to test for periodic spiking
* Add abs value to statement checking for proper spike periodicity for rf neurons
* Simplify unit test and see if it passes run ci

Co-authored-by: Jurado <[email protected]>
Co-authored-by: PhilippPlank <[email protected]>
Issue Number:
Objective of pull request: Add rf and rf_iz neurons to lava
Pull request checklist
Your PR fulfills the following requirements:
- Lint (flakeheaven lint src/lava tests/) and security (bandit -r src/lava/.) checks pass locally
- Unit tests (pytest) pass locally

Pull request type
Please check your PR type:
What is the current behavior?
- No support for rf neurons in lava
What is the new behavior?
- There should be a floating point and a Loihi bit-accurate implementation of rf neurons in lava. This implementation should match the lava-dl rf neuron implementations.
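For readers unfamiliar with the model, here is a minimal NumPy sketch of generic resonate-and-fire (RF) dynamics: a complex state that rotates at a preferred frequency, decays geometrically, and spikes when its imaginary part crosses a threshold. The function name, parameter names, and exact update rule are assumptions for this sketch, not the actual Lava process interface:

```python
import numpy as np

def rf_step(real, imag, inp, freq=10.0, decay=0.01, dt=1e-3, threshold=1.0):
    """One update of an RF neuron's complex state z = real + i*imag.

    The state rotates by theta = 2*pi*freq*dt per step, shrinks by a
    factor (1 - decay), and receives input on the real channel. A spike
    is emitted whenever the imaginary part reaches the threshold.
    (Illustrative sketch only; not the Lava rf/rf_iz implementation.)
    """
    theta = 2 * np.pi * freq * dt
    scale = 1.0 - decay
    new_real = scale * (np.cos(theta) * real - np.sin(theta) * imag) + inp
    new_imag = scale * (np.sin(theta) * real + np.cos(theta) * imag)
    spike = new_imag >= threshold
    return new_real, new_imag, spike

# Drive the neuron with a single input pulse; the state then oscillates
# and crosses the threshold about a quarter rotation later.
real, imag = 0.0, 0.0
spikes = []
for t in range(100):
    inp = 2.0 if t == 0 else 0.0
    real, imag, s = rf_step(real, imag, inp)
    spikes.append(bool(s))
```

A fixed point ("bit-accurate") variant would replace the cosine/sine rotation with quantized state variables and shifts, which is the part this PR aligns with lava-dl.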
Does this introduce a breaking change?
Supplemental information
This PR is the result of an ongoing discussion in which I asked whether or not rf neurons would be added to lava.
Please see the proof of concept example here.