
New transformers caching ETA now v4.38 #1348

Conversation

@BenjaminBossan (Member) commented Jan 11, 2024

See #1252 for more context.

The initial idea was for transformers v4.37 to add the new caching to all architectures, but this was postponed to v4.38. The code needs to be adapted so that prompt tuning does not break when transformers v4.37 is released.

Big thanks to Joao for the heads up.
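The gist of the change is gating behavior on the installed transformers version, now comparing against v4.38 instead of v4.37. A minimal sketch of such a version gate (the helper name and threshold handling are illustrative, not the PR's actual code):

```python
def uses_new_cache(installed: str, threshold: str = "4.38.0") -> bool:
    """Hypothetical helper: return True once the installed transformers
    version is expected to use the new caching for all architectures."""
    def parse(v: str) -> tuple:
        # Compare only the numeric major.minor.patch components.
        return tuple(int(p) for p in v.split(".")[:3])
    return parse(installed) >= parse(threshold)

# With the postponement, 4.37 releases must still take the legacy path:
print(uses_new_cache("4.37.2"))  # False: legacy caching
print(uses_new_cache("4.38.0"))  # True: new caching
```

In the actual code, a check like this would select between the legacy `past_key_values` handling and the new cache API in the prompt-tuning path.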

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@younesbelkada (Contributor) left a comment:

Makes sense, thanks!

@pacman100 (Contributor) left a comment:

Hello, please see #1352, which should be merged before this PR.

@BenjaminBossan BenjaminBossan merged commit 71585d6 into huggingface:main Jan 12, 2024
14 checks passed
@BenjaminBossan BenjaminBossan deleted the postpone-transformers-switch-caching branch January 12, 2024 10:54
5 participants