
[docs] Convert weights #345

Merged — 3 commits merged into main on Sep 6, 2023
Conversation

stevhliu (Member) commented Aug 25, 2023

From an internal Slack thread: this PR adds a short guide for converting weights to `.safetensors`. Let me know if I'm missing anything!

HuggingFaceDocBuilderDev commented Aug 25, 2023

The documentation is not available anymore as the PR was closed or merged.

Review comment on lines 7 to 23 of the guide:
```py
from transformers import AutoModel

model = AutoModel.from_pretrained(
"my-safe-model", revision="refs/pr/1", use_safetensors=True
)
```

Another way to convert your `.bin` files is to use the [`~safetensors.torch.save_model`] function:

```py
from transformers import AutoModel
from safetensors.torch import save_model

unsafe_model = AutoModel.from_pretrained("my-unsafe-model")
save_model(unsafe_model, "model.safetensors")
```
Collaborator commented:


I'm not a big fan of suggesting either way here.

  1. `AutoModel` will miss the model's head, leading to bad models being saved, and we can't invent the correct head for users. It also requires `transformers`, which `safetensors` doesn't depend on.
  2. This should work, and it is the preferred way when not using `transformers`. However, it might cause issues with `transformers` models because of weight naming: some weights share the same tensor, so `safetensors` will discard the duplicate names and everything will work fine on its own, but that can break the names `transformers` expects to see and clash with its internals.

Using the Space should be the go-to IMO.

Collaborator commented:

`bindings/python/convert.py` is basically the code run on the Space, too.

Member commented:

The Space has been crashing or not working for a number of users, which is why I requested a small guide (see this discussion thread).

Collaborator commented:

I could remove the logit sanity check; then it should work for more models. Still going to be dead slow, though.

If we could at least add progress bars, that'd help, but I'm not sure how to actually do that.

Collaborator commented:

@christophe-rannou @co42 Any way we could bump/replicate this Space to have more replicas? It's chugging pretty much everything it can at the moment.

Member Author commented:

> This should work and is the preferred way for doing it when not using transformers; however, it might cause issues with transformers models because of weight naming (some weights share the same name, safetensors will discard any and everything will work fine, but that might break the names transformers expects to see and clash with its own internals).

Maybe we can add a warning here to let users know they might run into issues if they use the `save_model` method with `transformers`, and they can try it on their own if they want?

Otherwise, we can just keep the Convert Space option with a warning that it may be slower for larger models, and point them toward `bindings/python/convert.py` if they want to convert locally.

Member Author commented:

Went with option 2 and only recommend the Convert Space; let me know if this is ok for you, @Narsil! 🙂

Collaborator commented:

LGTM, thanks for the ping.

Narsil merged commit cc5d941 into main on Sep 6, 2023
11 checks passed
Narsil deleted the convert-weights-docs branch on September 6, 2023 at 07:52
4 participants