
Commit 38fb6f6

Update src/transformers/models/llama/modeling_llama.py
Co-authored-by: Arthur <[email protected]>
fxmarty and ArthurZucker authored Apr 17, 2024
1 parent a94a441 commit 38fb6f6
Showing 1 changed file with 0 additions and 1 deletion.
1 change: 0 additions & 1 deletion src/transformers/models/llama/modeling_llama.py
@@ -1070,7 +1070,6 @@ def _update_causal_mask(
                 return attention_mask
             return None
 
-        ignore_causal_mask = False
         if self.config._attn_implementation == "sdpa":
             # For SDPA, when possible, we will rely on its `is_causal` argument instead of its `attn_mask` argument,
             # in order to dispatch on Flash Attention 2.
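
The removed line initialized `ignore_causal_mask` to a default of `False` ahead of the SDPA branch; this commit drops that initialization, presumably because the variable is only needed (and assigned) inside the SDPA path that follows.

For context, here is a minimal standalone sketch, separate from this commit, of the dispatch behavior the in-code comment describes: handing `torch.nn.functional.scaled_dot_product_attention` an explicit `attn_mask` rules out the flash kernel, while `is_causal=True` with no mask lets PyTorch select Flash Attention 2 on supported hardware. All shapes and tensor names below are illustrative.

    import torch
    import torch.nn.functional as F

    # Illustrative shapes only: (batch, heads, seq_len, head_dim).
    q = torch.randn(1, 8, 16, 64)
    k = torch.randn(1, 8, 16, 64)
    v = torch.randn(1, 8, 16, 64)

    # With an explicit boolean mask (True = attend), SDPA must use a
    # mask-aware backend, which excludes the flash kernel.
    causal_mask = torch.tril(torch.ones(16, 16, dtype=torch.bool))
    out_masked = F.scaled_dot_product_attention(q, k, v, attn_mask=causal_mask)

    # With is_causal=True and no materialized mask, SDPA is free to dispatch
    # to Flash Attention 2 when the backend supports it (CUDA, fp16/bf16).
    out_causal = F.scaled_dot_product_attention(q, k, v, is_causal=True)

    # Both paths compute the same causal attention.
    torch.testing.assert_close(out_masked, out_causal)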
