Implemented binary_cross_entropy_with_logits in paddle.nn.functional.loss #17033
Conversation
Results from the paddle ground truth and the other backends differ, so I set …
@KevinUli How do they differ? If you explicitly set …
@xoiga123 When running the function for torch and paddle, the results differ. …
Getting rid of the pos_weight kwarg returns the same value for all frameworks.
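For context, a minimal sketch of the kind of comparison being described here, calling the native torch and paddle functions with pos_weight set. The input values are illustrative assumptions, not the ones used in the PR's tests.

```python
# Sketch: compare native torch vs paddle binary_cross_entropy_with_logits
# when pos_weight is passed. Input values are illustrative assumptions only.
import numpy as np
import paddle
import torch

logits = np.array([0.5, -1.0, 2.0], dtype="float32")
labels = np.array([1.0, 0.0, 1.0], dtype="float32")
pos_weight = np.array([2.0, 2.0, 2.0], dtype="float32")

torch_out = torch.nn.functional.binary_cross_entropy_with_logits(
    torch.tensor(logits),
    torch.tensor(labels),
    pos_weight=torch.tensor(pos_weight),
    reduction="none",
)

paddle_out = paddle.nn.functional.binary_cross_entropy_with_logits(
    paddle.to_tensor(logits),
    paddle.to_tensor(labels),
    pos_weight=paddle.to_tensor(pos_weight),
    reduction="none",
)

# With pos_weight set, the two outputs reportedly disagree;
# dropping pos_weight makes them match across frameworks.
print(torch_out.numpy())
print(paddle_out.numpy())
```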
@KevinUli Yeah, seems weird, I'll look into the source and see what I can find. Thank you.
@KevinUli Yeah, clearly pytorch's implementation of … Add something like:

# TODO: paddle's implementation of pos_weight is wrong
# https://github.com/PaddlePaddle/Paddle/blob/f0422a28d75f9345fa3b801c01cd0284b3b44be3/python/paddle/nn/functional/loss.py#L831

and then proceed to open an issue on their side, or open a PR to fix it if you want to. I'll merge this PR, but please keep track of when it's fixed and add …
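For reference, a hedged sketch of what pos_weight is expected to do according to PyTorch's documented formula, where only the positive (y == 1) term is scaled. This is an illustration, not code from either library or from the PR.

```python
# NumPy reference following PyTorch's documented BCE-with-logits formula:
#   loss = -( pos_weight * y * log(sigmoid(x)) + (1 - y) * log(1 - sigmoid(x)) )
# pos_weight scales only the positive term. Sketch for illustration only.
import numpy as np

def bce_with_logits_ref(x, y, pos_weight=None):
    log_sig = -np.logaddexp(0.0, -x)           # log(sigmoid(x)), numerically stable
    log_one_minus_sig = -np.logaddexp(0.0, x)  # log(1 - sigmoid(x))
    pw = 1.0 if pos_weight is None else pos_weight
    return -(pw * y * log_sig + (1.0 - y) * log_one_minus_sig)
```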
I've opened an issue on their side. |
Well done 👍
Close #16741