
fix: Removed a wrong key-word argument in sigmoid_focal_loss() function call #31951

Merged

Conversation

Sai-Suraj-27
Contributor

What does this PR do?

The `reduction` keyword argument passed in the following function call is invalid: the definition of `sigmoid_focal_loss` has no parameter named `reduction`:

```python
loss = sigmoid_focal_loss(src_logits, target, self.alpha, self.gamma, reduction="none")
```

```python
def sigmoid_focal_loss(inputs, targets, num_boxes, alpha: float = 0.25, gamma: float = 2):
    """
    Loss used in RetinaNet for dense detection: https://arxiv.org/abs/1708.02002.

    Args:
        inputs (`torch.FloatTensor` of arbitrary shape):
            The predictions for each example.
        targets (`torch.FloatTensor` with the same shape as `inputs`):
            A tensor storing the binary classification label for each element in the `inputs` (0 for the negative
            class and 1 for the positive class).
        alpha (`float`, *optional*, defaults to `0.25`):
            Optional weighting factor in the range (0,1) to balance positive vs. negative examples.
        gamma (`int`, *optional*, defaults to `2`):
            Exponent of the modulating factor (1 - p_t) to balance easy vs hard examples.

    Returns:
        Loss tensor
    """
```

Fixes # (issue)

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

@ArthurZucker @amyeroberts

Collaborator

@amyeroberts amyeroberts left a comment


Thanks for fixing!

@amyeroberts amyeroberts merged commit 454bc14 into huggingface:main Jul 15, 2024
18 checks passed
The fix was subsequently picked up in several forks, each commit carrying the same message, "…tion call (huggingface#31951) — Removed a wrong key-word argument in sigmoid_focal_loss() function call.":

  • zucchini-nlp pushed a commit to zucchini-nlp/transformers that referenced this pull request Jul 16, 2024
  • amyeroberts pushed a commit to amyeroberts/transformers that referenced this pull request Jul 19, 2024
  • MHRDYN7 pushed a commit to MHRDYN7/transformers that referenced this pull request Jul 23, 2024
  • zucchini-nlp pushed a commit to zucchini-nlp/transformers that referenced this pull request Jul 24, 2024
  • itazap pushed a commit that referenced this pull request Jul 25, 2024