feat: implemented hardsilu activation function #26847
Conversation
Thank you for this PR, here are the CI results: This pull request does not result in any additional test failures. Congratulations!
PR Compliance Checks Passed!
Hi, thanks for contributing. Please remove all of the files inside .idea; they are not to be merged.
Hi, please don't delete the …
Hello, this PR has been inactive for a little while, so I'll close it for now. If you want to continue working on it in the future, please feel free to reopen it and pick up where you left off. Thanks, and happy contributing 😊
PR Description
I have implemented the hardsilu activation for tensorflow, numpy, torch, paddle, and jax. See the attached image below showing the test results in PyCharm.
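For context, hardsilu (also known as hard swish) is the input multiplied by a piecewise-linear approximation of the sigmoid: hardsilu(x) = x * relu6(x + 3) / 6. Below is a minimal numpy sketch of that formula; the function name and signature here are illustrative and may differ from the actual code in this PR.

```python
import numpy as np

def hardsilu(x: np.ndarray) -> np.ndarray:
    # hardsilu(x) = x * hard_sigmoid(x), where
    # hard_sigmoid(x) = relu6(x + 3) / 6 = clip(x + 3, 0, 6) / 6
    return x * np.clip(x + 3.0, 0.0, 6.0) / 6.0

# The function is exactly 0 for x <= -3, exactly x for x >= 3,
# and closely tracks x * sigmoid(x) in between.
print(hardsilu(np.array([-4.0, -1.0, 0.0, 1.0, 4.0])))
# approximately [-0.      -0.3333   0.      0.6667   4.    ]
```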
The only issue is that the jax test is not passing, but the same KeyError occurs with other existing activations in the repo. I tried jax versions lower than 0.4.18, and they all fail with a different error about a circular import. Note that I'm using a conda environment due to issues with Docker. I asked for help on Discord, but it seems no one knows how to address these issues yet.
Related Issue
Close #26741
Checklist