Hi everyone, I was experimenting with the "tiiuae/falcon-7b-instruct" model and I found an issue.

The problem is that, since PR huggingface/transformers#26137 was merged (i.e. from v4.34.0 onwards), Falcon models require the "position_ids" argument when calling the forward() pass of the FalconRotaryEmbedding module.

I saw that, thanks to PR #1421, the falcon_forward() method now accepts any additional positional arguments, most notably "position_ids", which is in turn required by the self.maybe_rotary() method, since it calls FalconRotaryEmbedding.forward() (see optimum/optimum/bettertransformer/models/attention.py, lines 939 to 940 at commit 8c296d3).

I don't know whether this is the best way to deal with the issue, as it doesn't handle the case in which "position_ids" is not provided (and I don't even know whether that can happen). Moreover, I didn't check whether the same applies to other LLM models.
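To make the missing-"position_ids" concern concrete, here is a minimal, hypothetical sketch of the kind of fallback being asked about: if the caller omits position_ids, default to positions 0..seq_len-1, which is what rotary embeddings assume for a non-padded prompt without a KV cache. Plain Python lists stand in for tensors, and the function names only mirror the ones discussed above — this is not the actual Optimum code.

```python
def rotary_angles(position_ids, head_dim=8, base=10000.0):
    """Angles theta_{p, i} = p / base**(2*i / head_dim), as used by
    rotary position embeddings. position_ids is a list of token
    positions (a stand-in for a 1-D tensor of positions)."""
    inv_freq = [1.0 / base ** (2 * i / head_dim) for i in range(head_dim // 2)]
    return [[p * f for f in inv_freq] for p in position_ids]


def falcon_forward(hidden_states, *args, position_ids=None, **kwargs):
    # Hypothetical fallback: when the caller does not pass position_ids,
    # assume the default positions 0..seq_len-1 instead of failing.
    seq_len = len(hidden_states)  # stand-in for hidden_states.shape[1]
    if position_ids is None:
        position_ids = list(range(seq_len))
    return rotary_angles(position_ids)
```

For example, `falcon_forward([0, 0, 0])` produces the same angles as passing `position_ids=[0, 1, 2]` explicitly; a caller with padding or a KV cache would still need to supply its own offsets.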
Hi @iosonopersia, thank you for the report! This is fixed in #1431. To keep the maintenance burden low, BetterTransformer for Falcon will be supported only for transformers >= 4.34, with a meaningful message in case a lower version is installed:
ImportError: FalconAttentionLayerBetterTransformer requires the transformers>=4.34 library but it was not found in your environment. You can install it with pip: pip install -U transformers. Please note that you may need to restart your runtime after installation.
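A version guard of the kind that error message implies could be sketched roughly as follows. This is a hypothetical illustration, not Optimum's actual check: the function name check_transformers_version and the naive dot-separated comparison are assumptions for the example.

```python
def check_transformers_version(installed: str, minimum: str = "4.34") -> None:
    """Raise ImportError if the installed transformers version is too old."""

    def as_tuple(version: str):
        # Compare only the numeric dot-separated parts: "4.34.0" -> (4, 34, 0).
        return tuple(int(part) for part in version.split(".") if part.isdigit())

    if as_tuple(installed) < as_tuple(minimum):
        raise ImportError(
            "FalconAttentionLayerBetterTransformer requires the "
            f"transformers>={minimum} library but it was not found in your "
            "environment. You can install it with pip: pip install -U transformers."
        )
```

So `check_transformers_version("4.34.0")` passes silently, while `check_transformers_version("4.33.2")` raises the ImportError shown above. A real implementation would more likely use a proper version parser (e.g. packaging.version) to handle pre-release tags correctly.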