
Upgrade Transformers to v4.41.x #712

Merged: calpt merged 5 commits from sync/v4.41.x into adapter-hub:main on Jul 12, 2024
Conversation

@calpt (Member) commented on Jun 29, 2024

Changes needed for sync:

- BERT/ViT: copy and adapt the new SDPA attention classes (see the attention sketch after this list)
- Update the copied `_prepare_encoder_decoder_kwargs_for_generation` in the model mixin
- Adjust 2-dim attention masks for prompt tuning (see the mask sketch after this list)
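To make the first item concrete, here is a minimal sketch of what an SDPA-based self-attention forward pass looks like when built on PyTorch's `torch.nn.functional.scaled_dot_product_attention`. This is not the actual class copied in this PR; the class name, argument names, and mask handling are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SdpaSelfAttentionSketch(nn.Module):
    """Illustrative SDPA self-attention, roughly in the shape of
    BERT/ViT-style attention; not the actual adapters/transformers class."""

    def __init__(self, hidden_size: int, num_heads: int, dropout: float = 0.0):
        super().__init__()
        assert hidden_size % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = hidden_size // num_heads
        self.dropout = dropout
        self.query = nn.Linear(hidden_size, hidden_size)
        self.key = nn.Linear(hidden_size, hidden_size)
        self.value = nn.Linear(hidden_size, hidden_size)

    def _split_heads(self, x: torch.Tensor) -> torch.Tensor:
        # (batch, seq, hidden) -> (batch, heads, seq, head_dim)
        b, s, _ = x.shape
        return x.view(b, s, self.num_heads, self.head_dim).transpose(1, 2)

    def forward(self, hidden_states, attention_mask=None):
        q = self._split_heads(self.query(hidden_states))
        k = self._split_heads(self.key(hidden_states))
        v = self._split_heads(self.value(hidden_states))
        # Fused attention kernel, available in PyTorch >= 2.0.
        attn = F.scaled_dot_product_attention(
            q, k, v,
            attn_mask=attention_mask,  # expanded 4-dim mask, or None
            dropout_p=self.dropout if self.training else 0.0,
        )
        b, _, s, _ = attn.shape
        # (batch, heads, seq, head_dim) -> (batch, seq, hidden)
        return attn.transpose(1, 2).reshape(b, s, -1)
```

The point of such classes is to replace the manual softmax(QKᵀ/√d)V computation with the fused kernel while keeping the surrounding module interface unchanged, which is why they have to be copied and re-adapted when adapter hooks live inside the attention module.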
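For the prompt-tuning item, a hedged sketch of the kind of adjustment involved: prompt tuning prepends virtual prompt tokens to the sequence, so a 2-dim `(batch, seq_len)` attention mask must grow by that many columns of ones. The helper name and `num_prompt_tokens` argument are hypothetical, not the library's actual API.

```python
import torch

def extend_mask_for_prompt(attention_mask: torch.Tensor,
                           num_prompt_tokens: int) -> torch.Tensor:
    """Prepend ones so the virtual prompt tokens are never masked out.
    Hypothetical helper; the library's actual adjustment may differ."""
    prefix = torch.ones(
        attention_mask.shape[0], num_prompt_tokens,
        dtype=attention_mask.dtype, device=attention_mask.device,
    )
    return torch.cat([prefix, attention_mask], dim=1)

mask = torch.tensor([[1, 1, 1, 0]])        # one padded sequence, seq_len=4
print(extend_mask_for_prompt(mask, 2))     # tensor([[1, 1, 1, 1, 1, 0]])
```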

@calpt added the sync label on Jun 29, 2024
@calpt marked this pull request as ready for review on Jul 6, 2024
@lenglaender (Member) left a comment:

Looks good to me!

@calpt merged commit dc53695 into adapter-hub:main on Jul 12, 2024
4 checks passed
@calpt deleted the sync/v4.41.x branch on Jul 12, 2024
dainis-boumber added a commit to ReDASers/adapters that referenced this pull request Aug 25, 2024