fix: [N, 1] pattern should take N to create 1d causal mask
davidqqq committed Nov 14, 2024
1 parent 210e32a commit 7814c5c
Showing 1 changed file with 1 addition and 1 deletion.
xformers/components/attention/global_tokens.py
@@ -70,7 +70,7 @@ def __init__(
         self.force_sparsity = force_sparsity

         if causal:
-            self.attention_mask &= causal_1d_pattern(attention_query_mask.shape[1])
+            self.attention_mask &= causal_1d_pattern(attention_query_mask.shape[0])

         self.attention_mask = (
             sparsify(self.attention_mask)
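For context, a minimal sketch of why the index matters. As the commit title says, attention_query_mask is an [N, 1] pattern, so shape[1] is always 1: the old code built a 1x1 causal mask that broadcast uniformly through the &= instead of enforcing causality, while shape[0] recovers the sequence length N. The causal_1d_pattern below is a local stand-in for the xformers helper, and the mask values are hypothetical:

import torch

def causal_1d_pattern(attn_size: int) -> torch.Tensor:
    # Local stand-in for xformers' causal_1d_pattern: a lower-triangular
    # boolean mask, so position i can only attend to positions j <= i.
    return torch.ones(attn_size, attn_size).tril().bool()

N = 8
# Hypothetical [N, 1] query pattern, matching the shape named in the commit title.
attention_query_mask = torch.rand(N, 1) < 0.9

# Before the fix: shape[1] of an [N, 1] tensor is 1, so the causal mask
# collapses to a single element instead of covering the sequence.
print(causal_1d_pattern(attention_query_mask.shape[1]).shape)  # torch.Size([1, 1])

# After the fix: shape[0] is the sequence length N, giving the intended N x N mask.
print(causal_1d_pattern(attention_query_mask.shape[0]).shape)  # torch.Size([8, 8])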
