
Add fallback to load state_dict with strict=False #398

Merged

Conversation

adrianeboyd
Contributor

Description

Due to incompatibilities related to state_dict keys between transformers v4.30 and v4.31, fall back to loading with strict=False.
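
The fallback itself is a small try/except around `load_state_dict`. A minimal sketch, assuming a plain PyTorch `nn.Module` (the helper name and warning message are illustrative, not the exact spacy-transformers code):

```python
import warnings

import torch


def load_state_dict_with_fallback(model: torch.nn.Module, state_dict: dict) -> None:
    """Hypothetical helper illustrating the fallback pattern."""
    try:
        # A strict load surfaces any key mismatch as a RuntimeError.
        model.load_state_dict(state_dict)
    except RuntimeError:
        # transformers v4.30 and v4.31 serialize some state_dict keys
        # differently, so retry while ignoring missing/unexpected keys.
        warnings.warn("Strict state_dict load failed; retrying with strict=False.")
        model.load_state_dict(state_dict, strict=False)
```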

Types of change

Enhancement.

Checklist

  • I confirm that I have the right to submit this contribution under the project's MIT license.
  • I ran the tests, and all new and existing tests passed.
  • My changes don't require a change to the documentation, or if they do, I've added all required information.

Due to incompatibilities related to `state_dict` keys between
`transformers` v4.30 and v4.31, fall back to loading with
`strict=False`.
@adrianeboyd adrianeboyd force-pushed the feature/torch-load-strict-backoff branch from ee26798 to fcd7e73 on August 11, 2023 14:00
@adrianeboyd
Contributor Author

Okay, this is a lot less breaking than v1.3.0. This could probably use a bit more spot testing on different platforms and with a wider range of models.
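
For spot testing, a quick smoke check could be as simple as loading a transformer-based pipeline and running a document through it (assuming `en_core_web_trf` is installed; any transformer-based pipeline would do):

```python
import spacy

# Load a transformer-based pipeline and process one document to confirm
# that the weights loaded (with or without the strict=False fallback)
# and that the downstream components still produce sensible output.
nlp = spacy.load("en_core_web_trf")
doc = nlp("Apple is looking at buying U.K. startup for $1 billion.")
print([(ent.text, ent.label_) for ent in doc.ents])
```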

@adrianeboyd adrianeboyd added the enhancement New feature or request label Aug 11, 2023

@rmitsch rmitsch left a comment


LGTM. I haven't looked deeply into spacy-transformers yet, though, so a second opinion might be useful.
