
[TODO] fix: BCELoss #17

Closed
L-M-Sherlock opened this issue Aug 22, 2023 · 0 comments · Fixed by #18
Labels
invalid This doesn't seem right

Comments

@L-M-Sherlock
Member

The current loss function is CrossEntropyLoss, which already includes a softmax layer internally. Its input is expected to contain the unnormalized logits for each class.

However, the FSRS model's output is the stability, which is used to calculate the retention at a given delta_t. The retention is already a probability. The softmax layer inside CrossEntropyLoss therefore processes the retention in an unintended way, so the weights cannot be optimized as expected. The loss should be BCELoss, which consumes probabilities directly.
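A minimal sketch of the problem, in plain Python (the `retention` formula and parameter values here are illustrative assumptions, not the exact FSRS implementation): binary cross-entropy reacts to changes in the predicted retention, while a softmax over a single output always yields 1.0, which makes the loss constant and kills the gradient.

```python
import math

def softmax(xs):
    # What CrossEntropyLoss applies internally to its input logits.
    exps = [math.exp(x) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def binary_cross_entropy(p, y):
    # What nn.BCELoss computes per sample: p is a probability, y is 0 or 1.
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

def retention(stability, delta_t):
    # Illustrative FSRS-style retention: probability of recall after
    # delta_t days, given the model's predicted stability.
    return math.exp(math.log(0.9) * delta_t / stability)

p = retention(stability=5.0, delta_t=5.0)   # 0.9 by construction
print(binary_cross_entropy(p, 1))           # sensitive to stability

# Feeding a single probability through a softmax (as CrossEntropyLoss
# does with a one-element input) always produces 1.0, regardless of
# the stability, so the loss no longer depends on the weights.
print(softmax([p])[0])
```

Because `softmax([p])` is 1.0 for any `p`, CrossEntropyLoss on the retention cannot distinguish good predictions from bad ones, whereas BCELoss penalizes low retention on recalled cards and high retention on forgotten ones.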

Reference:

@L-M-Sherlock L-M-Sherlock added the invalid This doesn't seem right label Aug 22, 2023
@L-M-Sherlock L-M-Sherlock linked a pull request Aug 22, 2023 that will close this issue