Thank you for your great work. I encountered an issue while trying to fine-tune LLaMA 3.1 and came here for reference. I was looking for the LLaMA 3.1 RoPE function change, but I couldn't find it in your repository. Based on this PR, it seems like it hasn't been added yet.
Thanks, I was out for 2 weeks and am just reading this. I may not have the bandwidth to address this immediately due to other issues on my list, but I hope to get to it soon.
Bug description
I'm re-posting this because the original issue, which was created with a question tag, was not receiving attention.
Per #1699
It seems that the Llama 3.1 RoPE is not implemented. Llama 3.1 uses a non-standard, frequency-dependent RoPE scaling, per https://github.com/huggingface/transformers/blob/5c1027bf09717f664b579e01cbb8ec3ef5aeb140/src/transformers/modeling_rope_utils.py#L329-L347
Fine-tuning with the standard RoPE instead will lead to spelling/grammar mistakes in the fine-tuned model's outputs.
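For reference, here is a minimal sketch of what the linked transformers code computes, written vectorized rather than as a loop. The function name `llama31_scale_inv_freq` is mine, and the defaults are my reading of the `rope_scaling` values from the Llama 3.1 config (factor 8, low/high frequency factors 1 and 4, original context length 8192):

```python
import math
import torch

def llama31_scale_inv_freq(
    inv_freq: torch.Tensor,
    factor: float = 8.0,
    low_freq_factor: float = 1.0,
    high_freq_factor: float = 4.0,
    old_context_len: int = 8192,
) -> torch.Tensor:
    # Wavelength of each rotary component.
    wavelen = 2 * math.pi / inv_freq
    low_freq_wavelen = old_context_len / low_freq_factor
    high_freq_wavelen = old_context_len / high_freq_factor

    # Low-frequency (long-wavelength) components are slowed down by `factor`;
    # high-frequency (short-wavelength) components are left untouched.
    scaled = torch.where(wavelen > low_freq_wavelen, inv_freq / factor, inv_freq)

    # Components between the two wavelength thresholds are linearly
    # interpolated ("smoothed") between the scaled and unscaled values.
    smooth = (old_context_len / wavelen - low_freq_factor) / (
        high_freq_factor - low_freq_factor
    )
    smoothed = (1 - smooth) * inv_freq / factor + smooth * inv_freq
    in_band = (wavelen >= high_freq_wavelen) & (wavelen <= low_freq_wavelen)
    return torch.where(in_band, smoothed, scaled)

# Example: standard RoPE inverse frequencies for head_dim=128 and the
# Llama 3.1 base of 500000, with the 3.1-specific scaling applied on top.
head_dim, base = 128, 500000.0
inv_freq = 1.0 / (base ** (torch.arange(0, head_dim, 2).float() / head_dim))
inv_freq = llama31_scale_inv_freq(inv_freq)
```

The scaled `inv_freq` would then replace the standard one when building the RoPE cos/sin caches; without this step the model runs with plain RoPE, which does not match how Llama 3.1 was trained.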
From @zzhhjjj:
Dear LitGPT Maintainer,
What operating system are you using?
Linux
LitGPT Version