
Use head_dim if in config for RoPE #32495

Merged
merged 3 commits into huggingface:main on Aug 16, 2024

Conversation

@suiyoubi (Contributor) commented Aug 7, 2024

What does this PR do?

Fixes an issue where RoPE is not initialized correctly when head_dim * num_attention_heads != hidden_size.
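
For context, here is a minimal sketch (using a made-up toy config, not the library's actual config class) of how the rotary dimension is derived before and after this change:

```python
# Toy config where head_dim is set explicitly and
# num_attention_heads * head_dim != hidden_size.
class ToyConfig:
    hidden_size = 3072
    num_attention_heads = 16
    head_dim = 256  # 16 * 256 = 4096, which is != 3072

config = ToyConfig()
partial_rotary_factor = 1.0

# Before: the rotary dim is derived from hidden_size, which is wrong whenever
# head_dim is configured independently of hidden_size.
dim_before = int((config.hidden_size // config.num_attention_heads) * partial_rotary_factor)

# After: prefer config.head_dim when the config defines it.
head_dim = getattr(config, "head_dim", config.hidden_size // config.num_attention_heads)
dim_after = int(head_dim * partial_rotary_factor)

print(dim_before, dim_after)  # 192 vs 256 -- different RoPE frequencies
```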

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

@ArthurZucker (Collaborator) left a comment


LGTM, that looks like a regression (for llama at least), cc @gante!

@ArthurZucker ArthurZucker requested a review from gante August 7, 2024 13:50
@suhara mentioned this pull request Aug 7, 2024
@suiyoubi (Contributor, Author) commented Aug 8, 2024

@gante, could you please take a look at this PR?

@suiyoubi (Contributor, Author) commented:

Hi @ArthurZucker @gante, may I ask if there is any update on this?

@ArthurZucker (Collaborator) left a comment


Let's simplify and good to go otherwise 🤗

@@ -58,7 +58,8 @@ def _compute_default_rope_parameters(
     elif config is not None:
         base = config.rope_theta
         partial_rotary_factor = config.partial_rotary_factor if hasattr(config, "partial_rotary_factor") else 1.0
-        dim = int((config.hidden_size // config.num_attention_heads) * partial_rotary_factor)
+        head_dim = config.head_dim if hasattr(config, "head_dim") else config.hidden_size // config.num_attention_heads
Collaborator


Suggested change
- head_dim = config.head_dim if hasattr(config, "head_dim") else config.hidden_size // config.num_attention_heads
+ head_dim = getattr(config, "head_dim", config.hidden_size // config.num_attention_heads)
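
For illustration (with a toy config object rather than the library's config class), the hasattr ternary and getattr with a default behave the same here; getattr is simply terser:

```python
class ToyConfig:
    hidden_size = 4096
    num_attention_heads = 32
    # no head_dim attribute defined

config = ToyConfig()

# Both expressions fall back to hidden_size // num_attention_heads when the
# attribute is missing, and both return config.head_dim when it is present.
via_hasattr = config.head_dim if hasattr(config, "head_dim") else config.hidden_size // config.num_attention_heads
via_getattr = getattr(config, "head_dim", config.hidden_size // config.num_attention_heads)
assert via_hasattr == via_getattr == 128
```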

@suiyoubi (Contributor, Author) replied:

thanks for catching this! just fixed!

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@ArthurZucker (Collaborator) left a comment


Cool, thanks for updating!

@ArthurZucker merged commit 5fd7ca7 into huggingface:main on Aug 16, 2024
20 checks passed
ArthurZucker pushed a commit that referenced this pull request Aug 16, 2024

* use head_dim if in config for RoPE
* typo
* simplify with getattr

ArthurZucker pushed two further commits that referenced this pull request Aug 20, 2024, each with the same message:

* use head_dim if in config for RoPE
* typo
* simplify with getattr