
FIX / Hub: Also catch for exceptions.ConnectionError #31469

Merged
merged 3 commits into from
Aug 22, 2024

Conversation

younesbelkada
Contributor

What does this PR do?

Fixes: #30920

Please see the linked issue for a detailed description; it seems that in transformers we hit a corner case when using HF offline mode together with loading a non-safetensors model offline.

cc @Wauplin as I am not 100% sure about this, and about why this is not caught within hf_raise_for_status in huggingface_hub

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@Wauplin
Contributor

Wauplin commented Jun 26, 2024

@younesbelkada I'm surprised that a requests.exceptions.ConnectionError can be raised by hf_raise_for_status. In theory, raise_for_status (from Python's requests package) is meant for when an HTTP response has been returned by the server and you want to raise if the server returned an error. A ConnectionError is different and happens before or while getting a response from the server. So to catch it you should do it here IMO.
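To make the distinction concrete, here is a minimal sketch using plain requests (the URL is hypothetical): raise_for_status only comes into play once a Response object exists, while a ConnectionError is raised by the request call itself.

```python
import requests

# A ConnectionError is raised by the request call itself when no response
# could be obtained at all (DNS failure, refused connection, offline machine).
# raise_for_status, by contrast, acts on a Response the server already
# returned and raises HTTPError for 4xx/5xx status codes.
try:
    response = requests.get("http://example.invalid/", timeout=5)
except requests.exceptions.ConnectionError:
    print("no response was ever received")
else:
    response.raise_for_status()  # would raise HTTPError on a 4xx/5xx reply
```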

While you are at it, you can also include read timeout and remove SSLError and ProxyError for which it's better to raise:

        except (requests.exceptions.SSLError, requests.exceptions.ProxyError):
            # Actually raise for those subclasses of ConnectionError
            raise
        except (
            requests.exceptions.ConnectionError,
            requests.exceptions.Timeout,
            OfflineModeIsEnabled,
        ):
            return has_file_in_cache

This is what we do in huggingface_hub download method.
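One reason the ordering of those except clauses matters: in requests, SSLError and ProxyError are themselves subclasses of ConnectionError, so a bare `except ConnectionError` clause would swallow them unless they are re-raised from an earlier, more specific clause. A quick check:

```python
import requests.exceptions as exc

# Both SSLError and ProxyError derive from ConnectionError, which is why
# the snippet above re-raises them *before* the broader except clause
# that falls back to the cache.
print(issubclass(exc.SSLError, exc.ConnectionError))    # True
print(issubclass(exc.ProxyError, exc.ConnectionError))  # True
```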


This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

Please note that issues that do not follow the contributing guidelines are likely to be ignored.

@Wauplin
Contributor

Wauplin commented Jul 22, 2024

@younesbelkada @huggingface/transformers-core-maintainers I still think this PR is worth updating + merging (see previous comment #31469 (comment)).

@amyeroberts
Collaborator

@Wauplin I updated following your comments, but I'm not 100% sure I'm catching it in the right way. LMK!

Contributor

@Wauplin Wauplin left a comment


Thanks for getting back to it @amyeroberts. I had a fresh look at it and I think the changes should be as suggested below. First we do the call and check for offline mode / connection error / timeout error, and then check for HTTPError independently.
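As a hedged sketch of that two-step flow (the function and variable names here are hypothetical, and a local stand-in replaces huggingface_hub's OfflineModeIsEnabled so the example stays self-contained):

```python
import requests


class OfflineModeIsEnabled(ConnectionError):
    """Local stand-in for huggingface_hub.utils.OfflineModeIsEnabled."""


def has_file_sketch(url: str, has_file_in_cache: bool = False) -> bool:
    # Step 1: perform the call, handling connectivity problems first.
    try:
        response = requests.head(url, timeout=10)
    except (requests.exceptions.SSLError, requests.exceptions.ProxyError):
        # Genuine configuration problems: surface them to the user.
        raise
    except (
        requests.exceptions.ConnectionError,
        requests.exceptions.Timeout,
        OfflineModeIsEnabled,
    ):
        # Network unreachable or offline mode: fall back to the cache.
        return has_file_in_cache
    # Step 2: only once a response exists, check the HTTP status independently.
    try:
        response.raise_for_status()
    except requests.exceptions.HTTPError:
        return False
    return True
```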

(Two review comments on src/transformers/utils/hub.py, since resolved.)
Contributor

@Wauplin Wauplin left a comment


Thanks!

@amyeroberts
Collaborator

@Wauplin Thanks for writing the correct code! As we both contributed to this, let's get a final pair of eyes to double-check this is OK.

Collaborator

@ArthurZucker ArthurZucker left a comment


Thanks, this was causing some confusion indeed.

@ArthurZucker ArthurZucker merged commit eeea712 into main Aug 22, 2024
20 of 22 checks passed
@ArthurZucker ArthurZucker deleted the younesbelkada-patch-2 branch August 22, 2024 13:29
zucchini-nlp pushed a commit to zucchini-nlp/transformers that referenced this pull request Aug 30, 2024
…1469)

* Update hub.py

* Update errors

* Apply suggestions from code review

Co-authored-by: Lucain <[email protected]>

---------

Co-authored-by: Amy Roberts <[email protected]>
Co-authored-by: Lucain <[email protected]>
Successfully merging this pull request may close these issues.

[BUG] Offline loading of non-safe tensors fails
5 participants