This issue was closed because it has been inactive for 14 days since being marked as stale.
Prerequisites
Please answer the following questions for yourself before submitting an issue.
Question/Conjecture:
I am performing model conversions per the guidelines in this PR, using the llama-bpe configs fetched: #6920 (comment)
...
The recent convert-hf-to-gguf-update.py script fetches the llama-bpe configs, but these reflect the Base model's settings.
Within the last week, these settings were changed in the meta-llama/Meta-Llama-3-8B-Instruct repo.
Is this change to the Instruct EOS token pertinent to the current conversion process?
To add: I haven't noticed any issues so far using either the Base model configs or the Instruct model configs.
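For anyone wanting to check whether the Base/Instruct mismatch matters in practice, a minimal sketch of the comparison is below. It diffs the eos_token_id entries of the two generation configs. The dict contents are hard-coded illustrations, not fetched live; the token IDs shown (<|end_of_text|> = 128001, <|eot_id|> = 128009) are the published Llama-3 special tokens, but double-check them against the actual repo files.

```python
# Hedged sketch: compare the EOS entries of the Base vs. Instruct
# generation_config.json snapshots to see which stop token(s) the
# Instruct repo added. Values below are illustrative copies, not
# a live fetch from meta-llama.

base_generation_config = {
    "bos_token_id": 128000,
    "eos_token_id": 128001,            # <|end_of_text|>
}
instruct_generation_config = {
    "bos_token_id": 128000,
    "eos_token_id": [128001, 128009],  # adds <|eot_id|>
}

def eos_ids(cfg):
    """Normalize eos_token_id (a bare int or a list) to a set of ids."""
    eos = cfg["eos_token_id"]
    return set(eos) if isinstance(eos, list) else {eos}

# Stop tokens present only in the Instruct config:
extra = eos_ids(instruct_generation_config) - eos_ids(base_generation_config)
print(sorted(extra))  # → [128009]
```

If the converted GGUF only carries the Base EOS (128001), an Instruct model may keep generating past its intended turn boundary, which is the practical symptom this question is probing for.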