More accurate softlif in tensorflow #45

Merged 2 commits into nengo:master on Jul 13, 2018

Conversation

@hunse (Collaborator) commented on Jul 3, 2018

This is a change that I also made in nengo_extras (see here).

Basically, for small j, z = sigma * log(1 + exp(j/sigma)) ≈ sigma * exp(j/sigma), since log(1 + x) ≈ x for small x. Then log(1 + 1/z) = log((z + 1)/z) ≈ log(1/z), since z is also small and so z + 1 ≈ 1. Finally, log(1/z) = -log(z) = -log(sigma * exp(j/sigma)) = -j/sigma - log(sigma).
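To illustrate the idea, here is a minimal NumPy sketch (not the actual nengo_dl implementation) of computing log(1 + 1/z) with z = sigma * log(1 + exp(j/sigma)): for very negative j, z underflows to zero and the direct formula blows up, so we switch to the asymptotic form -j/sigma - log(sigma) derived above. The function name and the crossover point x < -20 are illustrative choices, not taken from the PR.

```python
import numpy as np

def softlif_log_term(j, sigma=0.02):
    """Numerically stable log(1 + 1/z), z = sigma * log(1 + exp(j / sigma)).

    For x = j / sigma below a cutoff (here -20, an illustrative choice),
    z ~= sigma * exp(x) underflows, so we use the asymptotic form
    log(1 + 1/z) ~= -x - log(sigma) instead of the direct formula.
    """
    j = np.asarray(j, dtype=np.float64)
    x = j / sigma
    small = x < -20.0
    # logaddexp(0, x) computes log(1 + exp(x)) without overflow for large x;
    # for the "small" entries we substitute x = 0 so no invalid values are
    # produced in the branch we end up discarding.
    z = sigma * np.logaddexp(0.0, np.where(small, 0.0, x))
    return np.where(small, -x - np.log(sigma), np.log1p(1.0 / z))
```

Near the cutoff the two branches agree to many decimal places, and far below it the asymptotic branch stays finite where the direct formula would divide by zero.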

@drasmuss (Member) commented:
Pushed some minor fixups to the comments, as well as some updates I made to benchmarks.py while testing this. For reference, this change seems to result in about a 5-10% performance hit in training/inference on the benchmarks, but that seems like a reasonable price to pay for the increased accuracy.

@drasmuss force-pushed the softlif-tf-accuracy branch 2 times, most recently from 22023a4 to 67756e7 on July 13, 2018
@drasmuss merged commit 121c6fd into nengo:master on Jul 13, 2018