
PR #18105: [Add Feature] - Throw an error if softmax is used with 1 neuron #18201

Closed
copybara-service[bot] wants to merge 1 commit from Frightera:last_layer_softmax_warn

Conversation


@copybara-service copybara-service bot commented Jun 6, 2023

PR #18105: [Add Feature] - Throw an error if softmax is used with 1 neuron

Imported from GitHub PR #18105

This is a utility function that checks whether the usage of softmax makes sense (new users make this mistake a lot). Applying softmax to a single neuron makes the model output ones every time; there are many Stack Overflow posts about this.

In order to see this in action, please check [the gist](https://colab.research.google.com/gist/Frightera/fdcec020fff6ee9521ae2fd3eaed774d/checksoftmaxlastlayer.ipynb).
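
As a quick illustration of the mistake being guarded against (a minimal sketch assuming TensorFlow/Keras; this is not code from the PR):

```python
import numpy as np
import tensorflow as tf

# A single-unit output with softmax: the softmax axis has length 1,
# so normalization always produces exactly 1.0.
model = tf.keras.Sequential(
    [tf.keras.layers.Dense(1, activation="softmax", input_shape=(4,))]
)

x = np.random.rand(8, 4).astype("float32")
print(model.predict(x, verbose=0))  # every row is [1.]
```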

This also applies to any other layer (Conv2D, etc.) where the axis softmax is applied to (axis=-1 by default) has only one unit, as in the sketch below.
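
For example, the same thing happens with a convolutional head that has a single filter (again a hedged sketch, not code from the PR):

```python
import numpy as np
import tensorflow as tf

# One filter means the channel axis (the default softmax axis, axis=-1)
# has length 1, so every spatial position is normalized to 1.0.
model = tf.keras.Sequential(
    [tf.keras.layers.Conv2D(1, 3, activation="softmax", input_shape=(8, 8, 3))]
)

x = np.random.rand(2, 8, 8, 3).astype("float32")
preds = model.predict(x, verbose=0)
print(preds.min(), preds.max())  # both 1.0
```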
Copybara import of the project:

--
90c95b1 by Kaan Bıçakcı [email protected]:

Add last layer activation check for softmax

--
1cedb20 by Kaan Bıçakcı [email protected]:

Split logic for sequential and functional models

--
529f968 by Kaan Bıçakcı [email protected]:

Add tests for _check_last_layer_activation

--
d1acddb by Kaan Bıçakcı [email protected]:

Update sequential check

--
8363016 by Kaan Bıçakcı [email protected]:

Update tests, logic and reformatting

--
ebf16c3 by Kaan Bıçakcı [email protected]:

Update tests and the logic

--
afc156a by Kaan Bıçakcı [email protected]:

Make validate_softmax_activation experimental

--
3a228fb by Kaan Bıçakcı [email protected]:

Fix edge case for _validate_softmax_output

--
e9c950e by Kaan Bıçakcı [email protected]:

Check the softmax axis and raise an error if relevant

--
6355b23 by Kaan Bıçakcı [email protected]:

Update softmax check tests

--
a6745ee by Kaan Bıçakcı [email protected]:

Minor typo fix

--
92281f6 by Kaan Bıçakcı [email protected]:

Fix test fails for _check_output_activation_softmax

--
72a035f by Kaan Bıçakcı [email protected]:

Resolve conflict

--
0af059c by Kaan Bıçakcı [email protected]:

Squashed commit master (merge) to resolve conflicts

--
065cdea by Kaan Bıçakcı [email protected]:

Revert "Squashed commit master (merge) to resolve conflicts"

This reverts commit 0af059c.

--
446f1dd by Kaan Bıçakcı [email protected]:

Remove steps_per_execution_tuning from imports

--
1fbd931 by Kaan Bıçakcı [email protected]:

Fix lint

Merging this change closes #18105

FUTURE_COPYBARA_INTEGRATE_REVIEW=#18105 from Frightera:last_layer_softmax_warn 1fbd931
PiperOrigin-RevId: 538223534
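
The commits above reference helpers such as `_validate_softmax_output` and an experimental `validate_softmax_activation` switch; the actual implementation lives in PR #18105. Purely for illustration, here is a rough, hypothetical sketch of the kind of check described (the helper name and error message are invented here, not taken from the PR):

```python
import tensorflow as tf

def softmax_axis_has_one_unit(layer, axis=-1):
    """Hypothetical helper: True if `layer` uses softmax and its output
    has only a single unit along `axis`, making the output constant."""
    if getattr(layer, "activation", None) is not tf.keras.activations.softmax:
        return False
    return layer.output.shape[axis] == 1

# Usage sketch: inspect a model's last layer before training.
model = tf.keras.Sequential(
    [tf.keras.layers.Dense(1, activation="softmax", input_shape=(4,))]
)
if softmax_axis_has_one_unit(model.layers[-1]):
    raise ValueError(
        "The last layer applies softmax over an axis with a single unit, "
        "so the model will always output 1. Use a sigmoid activation for "
        "binary classification, or increase the number of units."
    )
```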

@copybara-service copybara-service bot changed the title from "PR #18105: [Add Feature] - Warn user if softmax usage is wrong" to "PR #18105: [Add Feature] - Throw an error if softmax is used with 1 neuron" on Jun 13, 2023