1522 # If we don't have any hooks, we want to skip the rest of the logic in
1523 # this function, and just call forward.
1524 if not (self._backward_hooks or self._backward_pre_hooks or self._forward_hooks or self._forward_pre_hooks
1525 or _global_backward_pre_hooks or _global_backward_hooks
1526 or _global_forward_hooks or _global_forward_pre_hooks):
-> 1527 return forward_call(*args, **kwargs)
1529 try:
1530 result = None
TypeError: Linear.forward() got an unexpected keyword argument 'scale'
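The shape of the failure can be illustrated without torch or diffusers. The sketch below is hypothetical, not diffusers code: a caller that always forwards a LoRA `scale` keyword works against a layer whose `forward()` accepts it, but raises exactly this kind of `TypeError` against a plain `Linear`-style layer whose `forward()` takes only the input.

```python
# Minimal sketch (assumed names, not the diffusers implementation) of why
# passing `scale=` to a plain Linear's forward() raises the TypeError above.

class PlainLinear:
    """Stands in for torch.nn.Linear: forward() takes only the input."""
    def forward(self, x):
        return x

class LoraCompatibleLinear(PlainLinear):
    """Stands in for a projection layer that also accepts a LoRA `scale`."""
    def forward(self, x, scale=1.0):
        return [v * scale for v in x]

def attention_projection(layer, hidden_states, scale=1.0):
    # The attention processor unconditionally passes `scale=`; this only
    # works if the layer's forward() signature accepts that keyword.
    return layer.forward(hidden_states, scale=scale)

print(attention_projection(LoraCompatibleLinear(), [1.0, 2.0], scale=0.5))

try:
    attention_projection(PlainLinear(), [1.0, 2.0], scale=0.5)
except TypeError as e:
    # Mirrors the reported error: forward() got an unexpected keyword
    # argument 'scale'.
    print(e)
```

In the real stack, the mismatch arises when the active attention processor forwards `scale` to projection layers that, after PEFT replaces them, no longer accept that keyword.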
Logs
No response
System Info
- `diffusers` version: 0.22.0.dev0
- Platform: Linux-5.15.0-76-generic-x86_64-with-glibc2.35
- Python version: 3.10.13
- PyTorch version (GPU?): 2.1.0+cu121 (True)
- Huggingface_hub version: 0.17.3
- Transformers version: 4.34.0
- Accelerate version: 0.23.0
- xFormers version: 0.0.22.post4
- Using GPU in script?: <fill in>
- Using distributed or parallel set-up in script?: <fill in>
Who can help?
No response
Describe the bug
PEFT LoRA does not work with xFormers: with a PEFT LoRA loaded and xFormers memory-efficient attention enabled, running the pipeline fails with `TypeError: Linear.forward() got an unexpected keyword argument 'scale'` (traceback above).
Reproduction