Bug description

It seems that they updated the Gemma v1 2B weights. Something to look into.

Just by quickly checking, I saw that neither the HF modeling file nor the weights were updated on our side. The error message says that it expected norm_2.weight but instead got post_attention_norm.weight, even though the names have to map perfectly.

We can either fix these checkpoints or remove them. Since Gemma 2 is out, I'm not sure why anyone would still care about Gemma 1. What do you think, @Andrei-Aksionov?
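If the new checkpoints really did rename the post-attention norm, the fix on our side could be as small as extending the conversion's key map. A minimal sketch of that idea (the mapping entries below are assumptions based on the error message, not LitGPT's actual conversion table):

```python
def remap_keys(hf_state_dict: dict) -> dict:
    """Rename updated HF checkpoint keys to the names LitGPT expects.

    Illustrative only: the two entries below are guesses inferred from
    the reported error, not the real litgpt mapping.
    """
    key_map = {
        "post_attention_norm.weight": "norm_2.weight",
        "pre_attention_norm.weight": "norm_1.weight",
    }
    # Keys not in the map pass through unchanged.
    return {key_map.get(name, name): tensor
            for name, tensor in hf_state_dict.items()}
```

With a table like this, a checkpoint carrying post_attention_norm.weight would load into the parameter LitGPT calls norm_2.weight, while untouched keys are left alone.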
What operating system are you using?
Unknown
LitGPT Version