Thank you for this!
I found an issue in check_lora_weights.py: the script showed 0.0 for some layers even when the weights were trained, because of an overflow in torch.mean. Sorry for that.
I've fixed the issue, and the script now reports correct values.
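For anyone hitting the same symptom: the usual fix for this class of bug is to upcast to float32 before the reduction. A minimal sketch of that pattern (the function name, file path, and per-key loop here are illustrative, not the actual script):

```python
import torch
from safetensors.torch import load_file

def abs_mean(weight: torch.Tensor) -> float:
    # Upcast to float32 before the reduction so fp16 weights
    # don't overflow/underflow while accumulating the mean.
    return weight.abs().float().mean().item()

# Illustrative usage: report the mean |weight| of each tensor
# in a LoRA file ("lora.safetensors" is a placeholder path).
state_dict = load_file("lora.safetensors")
for key, value in state_dict.items():
    print(f"{key}: {abs_mean(value):.6e}")
```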
--save_precision=fp16 causes a lot of zeros in the LoRA modules.
--save_precision=float seems to work fine.
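That would be consistent with fp16's limited range (my assumption about the mechanism, not something confirmed in this thread): magnitudes below fp16's smallest subnormal (~5.96e-8) flush to zero when the state dict is cast for saving, while float keeps the full fp32 values. A quick demonstration:

```python
import torch

# Hypothetical small trained LoRA deltas.
w = torch.tensor([1e-9, 2e-8, 1e-4], dtype=torch.float32)

# Casting to fp16 (what --save_precision=fp16 presumably does to the
# state dict before saving) flushes the first two entries to zero.
print(w.to(torch.float16))  # tensor([0.0000e+00, 0.0000e+00, 1.0002e-04], dtype=torch.float16)
print(w)                    # fp32 keeps all three values
```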