
Generate: fix ExponentialDecayLengthPenalty doctest #27485

Merged: 1 commit into huggingface:main on Nov 14, 2023

Conversation

@gante (Member) commented on Nov 14, 2023

What does this PR do?

The doctest was failing... but the root cause of the failure in transformers.generation.logits_process.ExponentialDecayLengthPenalty is the transition from torch==2.0 to torch==2.1 (i.e. installing torch==2.0 fixes it).

I couldn’t find a related entry in the release notes, but I know torch.multinomial was changed after the 2.0 release: it could previously sample elements with probability=0.0. This change may affect this doctest, since the example is sample-based and may produce tokens with probability=0.0.
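
As a rough illustration of the behaviour described above (this snippet is not part of the PR, and the exact version boundary is assumed from the comment), older torch versions could occasionally draw a zero-probability entry:

```python
import torch

torch.manual_seed(0)

# A distribution with an explicit zero-probability entry at index 0.
probs = torch.tensor([0.0, 0.5, 0.5])

# Per the description above, torch < 2.1 could occasionally return index 0
# despite its zero probability; on newer torch versions it should never appear.
samples = torch.multinomial(probs, num_samples=10_000, replacement=True)
print("zero-probability index sampled:", bool((samples == 0).any()))
```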

As such, the fix consists of updating the test's outputs. I took the opportunity to improve the example as well :)
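
For context, ExponentialDecayLengthPenalty is typically enabled through the exponential_decay_length_penalty argument of generate, a (start_index, decay_factor) tuple. Below is a minimal usage sketch; it is not the updated doctest from the PR, and the model name and penalty values are illustrative:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("A sequence: 1, 2, 3, 4,", return_tensors="pt")

# After 10 generated tokens, start boosting the EOS score with a decay factor
# of 1.5 -- the further generation goes past that point, the stronger the push
# towards ending the sequence.
outputs = model.generate(
    **inputs,
    do_sample=True,
    exponential_decay_length_penalty=(10, 1.5),
    max_new_tokens=30,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```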

@HuggingFaceDocBuilderDev commented on Nov 14, 2023

The documentation is not available anymore as the PR was closed or merged.

@amyeroberts (Collaborator) left a comment

Thanks for the fix, and for the investigation and description!

@ydshieh (Collaborator) left a comment

Great, thanks a lot!

@ydshieh (Collaborator) commented on Nov 14, 2023

The failing test

src/transformers/generation/logits_process.py::transformers.generation.logits_process.WhisperTimeStampLogitsProcessor

is a known failure on the doctest CI, so this is good to merge from my side. I already pinged @sanchit-gandhi on Slack.

@gante (Member, Author) commented on Nov 14, 2023

@amyeroberts would you be able to merge this PR, since the failure is unrelated (a different doctest in the same file)? :)

@ydshieh (Collaborator) commented on Nov 14, 2023

I can do it, but maybe we should wait for @amyeroberts this time (so she knows I can 😄, without surprises).

@amyeroberts (Collaborator) commented:

I can merge!

cc @sanchit-gandhi as the failing doctest is a Whisper one.

@amyeroberts merged commit fe472b1 into huggingface:main on Nov 14, 2023
6 of 8 checks passed
@gante deleted the exponential_logits_fix branch on Nov 14, 2023 at 18:40
Saibo-creator pushed a commit to epfl-dlab/transformers-GCD-PR that referenced this pull request Nov 15, 2023
wgifford pushed a commit to namctin/transformers that referenced this pull request Nov 17, 2023
EduardoPach pushed a commit to EduardoPach/transformers that referenced this pull request Nov 19, 2023