position_ids is now required by FalconRotaryEmbedding forward method #1430

Closed
iosonopersia opened this issue Oct 5, 2023 · 1 comment · Fixed by #1431
@iosonopersia

Hi everyone, I was experimenting with the "tiiuae/falcon-7b-instruct" model and I found an issue.

My setup:

optimum==1.13.2
torch==2.0.1+cu118
transformers==4.34.0

The problem is that, since this PR (huggingface/transformers#26137) was merged (i.e. from transformers v4.34.0 onwards), Falcon models require the "position_ids" argument when calling the forward() method of the FalconRotaryEmbedding module.
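
For context, here is a minimal sketch of a call path that should trigger the failure, using the standard BetterTransformer entry point (illustrative, not a verified reproduction):

from transformers import AutoModelForCausalLM, AutoTokenizer
from optimum.bettertransformer import BetterTransformer

tokenizer = AutoTokenizer.from_pretrained("tiiuae/falcon-7b-instruct")
model = AutoModelForCausalLM.from_pretrained("tiiuae/falcon-7b-instruct")
model = BetterTransformer.transform(model)  # patches attention with falcon_forward()

inputs = tokenizer("Hello", return_tensors="pt")
# With transformers==4.34.0, this raises inside FalconRotaryEmbedding.forward()
# because falcon_forward() does not pass the now-required position_ids.
model.generate(**inputs, max_new_tokens=5)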

I saw that, thanks to PR #1421, the falcon_forward() method now accepts arbitrary additional keyword arguments, most notably "position_ids" (which is in turn required by the self.maybe_rotary() method, since it calls FalconRotaryEmbedding.forward()). However, falcon_forward() never passes it along:

# Current code in falcon_forward(): position_ids arrives via kwargs but is never
# forwarded to maybe_rotary(), so FalconRotaryEmbedding.forward() fails.
past_kv_length = 0 if layer_past is None else layer_past[0].shape[1]
query_layer, key_layer = self.maybe_rotary(query_layer, key_layer, past_kv_length)

My workaround is the following:

 if "position_ids" in kwargs:
        past_kv_length = 0 if layer_past is None else layer_past[0].shape[1]
        query_layer, key_layer = self.maybe_rotary(query_layer, key_layer, past_kv_length, kwargs["position_ids"])

I don't know if this is the best way to deal with the issue, as it doesn't handle the case in which "position_ids" is not provided (I don't even know whether that case can occur). Moreover, I didn't check whether the same problem affects other LLM architectures. A possible fallback for the missing-"position_ids" case is sketched below.
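
If a fallback is wanted, one option is to synthesize sequential positions when "position_ids" is absent. This is only a sketch under my assumptions (in particular, that query_layer carries the sequence length in dimension 1 at this point in falcon_forward(), and that torch is already imported in that module):

past_kv_length = 0 if layer_past is None else layer_past[0].shape[1]
position_ids = kwargs.get("position_ids")
if position_ids is None:
    # Assumed shape: query_layer is (batch, seq_len, head_dim) here. Build the
    # positions past_kv_length .. past_kv_length + seq_len - 1 on the right device.
    seq_len = query_layer.shape[1]
    position_ids = torch.arange(
        past_kv_length, past_kv_length + seq_len, device=query_layer.device
    ).unsqueeze(0)
query_layer, key_layer = self.maybe_rotary(query_layer, key_layer, past_kv_length, position_ids)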

@fxmarty (Contributor) commented Oct 5, 2023

Hi @iosonopersia, thank you for the report! This is fixed in #1431. To keep the maintenance burden low, BetterTransformer for Falcon will only be supported with transformers >= 4.34, with a meaningful error message raised when a lower version is installed:

ImportError: FalconAttentionLayerBetterTransformer requires the transformers>=4.34 library but it was not found in your environment. You can install it with pip: pip install -U transformers. Please note that you may need to restart your runtime after installation.
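
For reference, a minimal sketch of such a version gate, assuming optimum's existing check_if_transformers_greater utility (the actual implementation in #1431 may differ):

from optimum.utils import check_if_transformers_greater

if not check_if_transformers_greater("4.34"):
    raise ImportError(
        "FalconAttentionLayerBetterTransformer requires the transformers>=4.34 library "
        "but it was not found in your environment. You can install it with pip: "
        "pip install -U transformers. Please note that you may need to restart your "
        "runtime after installation."
    )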
