
qa not calculated correctly in backpropagation #18

Open
andreicscs opened this issue Oct 1, 2024 · 0 comments

In `nn.h`, line 419:

```c
float qa = dactf(a, NN_ACT);
```

`qa`, the output of the activation function's derivative, should be computed from the pre-activation value (`o` or `z`) instead of from the activated output `a`. This is crucial because the derivative of the activation function must be taken with respect to that function's input, which is the pre-activation value.
[images: chain-rule derivation of the activation derivative]

Therefore, in the current code the chain rule is not applied correctly.
