We ran into the same issue when doing full fine-tuning on Llama 3.1 8B. We observed that after a few training samples, the model started to generate responses with misspelled words and grammatical errors. @zzhhjjj I suggest changing this to a bug label if you can.
Dear LitGPT Maintainer,
Thank you for your great work. I encountered an issue while trying to fine-tune LLaMA 3.1 and came here for reference. I was looking for the LLaMA 3.1 RoPE function change, but I couldn't find it in your repository. Based on this PR, it seems like it hasn't been added yet.
https://github.com/Lightning-AI/litgpt/pull/1619/files#diff-3b8a58a4d021803b3171b886bb9162fd659e671131f3f61036f9210cb5d0bc7c
Reference: https://github.com/huggingface/transformers/blob/5c1027bf09717f664b579e01cbb8ec3ef5aeb140/src/transformers/modeling_rope_utils.py#L329-L347
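For context, here is a minimal sketch of what that transformers function does to the standard RoPE inverse frequencies. The function name is my own, and the default values (factor 8, low/high frequency factors 1 and 4, original context length 8192, rope_theta 500000, head_dim 128) are assumptions taken from the published Llama 3.1 8B config, not from litgpt:

```python
import math
import torch


def llama3_scale_inv_freq(
    inv_freq: torch.Tensor,
    factor: float = 8.0,            # assumed Llama 3.1 defaults
    low_freq_factor: float = 1.0,
    high_freq_factor: float = 4.0,
    old_context_len: int = 8192,
) -> torch.Tensor:
    """Rescale RoPE inverse frequencies the way Llama 3.1 does (sketch)."""
    low_freq_wavelen = old_context_len / low_freq_factor
    high_freq_wavelen = old_context_len / high_freq_factor

    wavelen = 2 * math.pi / inv_freq

    # Long wavelengths (low frequencies) are divided by `factor`;
    # short wavelengths (high frequencies) are left untouched.
    scaled = torch.where(wavelen > low_freq_wavelen, inv_freq / factor, inv_freq)

    # Wavelengths in between are smoothly interpolated between the two regimes.
    smooth = (old_context_len / wavelen - low_freq_factor) / (
        high_freq_factor - low_freq_factor
    )
    smoothed = (1 - smooth) * scaled / factor + smooth * scaled
    is_medium = (wavelen >= high_freq_wavelen) & (wavelen <= low_freq_wavelen)
    return torch.where(is_medium, smoothed, scaled)


# Example: plain RoPE frequencies for an assumed head_dim=128, rope_theta=500000,
# then rescaled before building the cos/sin cache.
head_dim, base = 128, 500_000.0
inv_freq = 1.0 / (base ** (torch.arange(0, head_dim, 2).float() / head_dim))
inv_freq = llama3_scale_inv_freq(inv_freq)
```

Without this rescaling, the cos/sin cache no longer matches the Hugging Face checkpoint, which would be consistent with the degraded fine-tuning results reported above.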
Thanks for your help.