
Make using safetensors files automated. #27571

Merged
merged 16 commits into main on Dec 1, 2023
Conversation

Narsil
Contributor

@Narsil Narsil commented Nov 17, 2023

If `use_safetensors=True` is used, and the safetensors file doesn't exist:

  • Don't crash just yet
  • Look for an open PR containing it.
  • If one exists, use that instead
  • If not, trigger the conversion Space, wait for the conversion to finish
    and the PR to be opened
  • Use that new PR
  • Profit.
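The "look for an open PR" step in the list above can be sketched as follows. This is a minimal illustration, not the PR's actual code: the `Discussion` dataclass is a stand-in for huggingface_hub's discussion objects, and `select_conversion_pr` is a hypothetical helper name.

```python
from dataclasses import dataclass
from typing import Iterable, Optional


@dataclass
class Discussion:
    # Stand-in for huggingface_hub's Discussion object (illustrative only).
    title: str
    status: str
    is_pull_request: bool
    git_reference: str


def select_conversion_pr(discussions: Iterable[Discussion], pr_title: str) -> Optional[str]:
    # Return the git reference (e.g. "refs/pr/1") of an already-open
    # conversion PR with the expected title, or None if there is none.
    for d in discussions:
        if d.is_pull_request and d.status == "open" and d.title == pr_title:
            return d.git_reference
    return None
```

If this returns `None`, the fallback is to trigger the conversion Space and wait for it to open a fresh PR.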

What does this PR do?

Fixes # (issue)

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a Github issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

@Narsil Narsil marked this pull request as draft November 17, 2023 19:19
@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint.

Member

@LysandreJik LysandreJik left a comment


First of all it's very cool, thanks for working on it! It gets the job done. I left mostly random nits, but here's the gist of what I think:

In terms of the implementation/what can be accepted in transformers:

  • the specific module looks good to me.
  • I would relax the websockets requirement in favor of requests
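Relaxing websockets in favor of requests could look roughly like this: server-sent events (SSE) are plain HTTP, so a streaming `requests` call is enough. A hedged sketch — `parse_sse_data` and `stream_conversion_events` are illustrative names, and the event format (payloads on `data:` lines) is the standard SSE convention, not something confirmed by this PR:

```python
import requests  # third-party; the only HTTP dependency in this sketch


def parse_sse_data(lines):
    # Extract the payloads of "data:" lines from a server-sent-events
    # stream; comment lines (": ...") and other fields are skipped.
    events = []
    for line in lines:
        if line.startswith("data:"):
            events.append(line[len("data:"):].strip())
    return events


def stream_conversion_events(url: str):
    # Consume an SSE endpoint with plain requests instead of websockets.
    with requests.get(url, stream=True, headers={"Accept": "text/event-stream"}) as resp:
        resp.raise_for_status()
        return parse_sse_data(resp.iter_lines(decode_unicode=True))
```

Keeping the line parsing in a separate pure function makes it testable without a live endpoint.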

General workflow

  • Right now it asks for a token and opens a PR under the token's owner (https://huggingface.co/lysandre/ctrl-clone-2/discussions/1). I'd much favor if it was opened from the bot's perspective. Having the PR be opened by a third party makes me uneasy as you have no certainty it was actually opened through this workflow.
  • I think we might have an issue with sharded checkpoints, will need to check
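On the sharded point: a sharded safetensors checkpoint is announced by an index file rather than a single weights file, so any detection logic needs to check both names. A minimal sketch, assuming the usual transformers filename conventions (`model.safetensors` / `model.safetensors.index.json`); the helper itself is illustrative, not code from this PR:

```python
SAFE_WEIGHTS_NAME = "model.safetensors"
SAFE_WEIGHTS_INDEX_NAME = "model.safetensors.index.json"


def safetensors_variant(repo_files):
    # Given the filenames in a repo (or PR revision), report whether a
    # sharded or single-file safetensors checkpoint is present.
    if SAFE_WEIGHTS_INDEX_NAME in repo_files:
        return "sharded"
    if SAFE_WEIGHTS_NAME in repo_files:
        return "single"
    return None
```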

Review threads on src/transformers/safetensors_conversion.py and src/transformers/modeling_utils.py (all resolved).
Comment on lines 11 to 23
def previous_pr(api: "HfApi", model_id: str, pr_title: str) -> Optional["Discussion"]:
    try:
        main_commit = api.list_repo_commits(model_id)[0].commit_id
        discussions = api.get_repo_discussions(repo_id=model_id)
    except Exception:
        return None
    for discussion in discussions:
        if discussion.status == "open" and discussion.is_pull_request and discussion.title == pr_title:
            commits = api.list_repo_commits(model_id, revision=discussion.git_reference)

            if main_commit == commits[1].commit_id:
                return discussion
    return None
Member


We can implement a Hub API to replace this whole function IMO, cc @SBrandeis

Contributor


Courtesy of @SBrandeis (huggingface/huggingface_hub#1845):

for discussion in get_repo_discussions(
    repo_id="openai/whisper-large-v3",
    author="sanchit-gandhi",
    discussion_type="pull_request",
    discussion_status="open",
):
    ...

Member

@LysandreJik LysandreJik Nov 24, 2023


Thanks Simon and Lucain! Applied in d32a18e

* Websocket -> SSE
* Support sharded + tests + cleanup
* env var
Contributor

@Wauplin Wauplin left a comment


Added a few comments (mostly from #27656 where I started to put comments and forgot to click "finish your review"... 😬). Nice implementation to handle SSE and test it end to end!

Review threads on tests/test_modeling_utils.py and src/transformers/safetensors_conversion.py (all resolved).
@LysandreJik LysandreJik marked this pull request as ready for review November 24, 2023 14:55
@LysandreJik LysandreJik changed the title [WIP] Make using safetensors files automated. Make using safetensors files automated. Nov 24, 2023
@LysandreJik
Member

It's starting to look good! I've tried it with Intel/neural-chat-7b-v3-1, a 7b model, and it took ~3 minutes between when I sent the initial request and when it started downloading the safetensors checkpoint. The first minute was spent waiting for previous conversions to wrap up.

Is there a possibility for us to speed this up even more/parallelize it?

@LysandreJik
Member

@ArthurZucker can you take a look when you have a second?

@ArthurZucker
Collaborator

Reviewing now

Collaborator

@ArthurZucker ArthurZucker left a comment


Super nice. Love how our tools interact so well together, thanks all 🤗

Review threads on src/transformers/modeling_utils.py, src/transformers/safetensors_conversion.py, and tests/test_modeling_utils.py (all resolved).
Co-authored-by: ArthurZucker <[email protected]>
@LysandreJik
Member

Thanks for the reviews, merging!

@LysandreJik LysandreJik merged commit 7b6324e into main Dec 1, 2023
23 checks passed
@LysandreJik LysandreJik deleted the auto_safetensors branch December 1, 2023 14:51