Template failed loading: template: prompt:1:8: executing "prompt" at <.RoleName>: can't evaluate field RoleName in type model.PromptTemplateData #1151
https://localai.io/advanced/
Hello, I also encountered this problem. It shows the same error.
How did you get it to work? Could you share your configuration file for reference? Thanks.
I'm not sure whether LocalAI's templates support RoleName, or the format you wrote. The template parameters I relied on are the ones documented at https://localai.io/advanced/
Oh, I see RoleName in your config llama2.yaml. So the error disappears after you add system_prompt; does that mean LocalAI supports RoleName? I also saw a similar Jinja-style format in the examples, but it reported the same error for me.
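For what it's worth, the error message itself says Go's template engine can't find a RoleName field on model.PromptTemplateData, which suggests RoleName simply isn't available in the chat/completion prompt templates. Here is a minimal sketch of a llama2-chat.tmpl that sticks to fields described at https://localai.io/advanced/ (.SystemPrompt and .Input are assumptions based on those docs, not taken from the reporter's actual file):

```
{{- /* Minimal Llama 2 chat prompt template (sketch).
       Uses only .SystemPrompt and .Input, which are assumed to exist on
       PromptTemplateData per https://localai.io/advanced/; the error in
       this issue indicates .RoleName does not. */ -}}
[INST] <<SYS>>
{{.SystemPrompt}}
<</SYS>>

{{.Input}} [/INST]
```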
LocalAI version:
quay.io/go-skynet/local-ai:latest
Environment, CPU architecture, OS, and Version:
Linux localhost.localdomain 3.10.0-1160.99.1.el7.x86_64 #1 SMP Wed Sep 13 14:19:20 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux
Describe the bug
I use Llama2-chat-13B and ask questions via Postman. LocalAI returns the error above.
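The request was presumably an OpenAI-style chat completion call against LocalAI's /v1/chat/completions endpoint; roughly like the following (the model name and prompt are placeholders for whatever the Postman screenshot showed):

```sh
# Hypothetical reproduction of the Postman request; the model name and
# message content are placeholders, and 8080 is LocalAI's default port.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "llama2-chat-13b",
        "messages": [{"role": "user", "content": "Hello"}]
      }'
```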
To Reproduce
[screenshot: Postman request]
[screenshot: models folder]
[attachment: llama2.yaml]
[attachment: llama2-chat.tmpl]
[attachment: llama2-completion.tmpl]
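For anyone trying to reproduce without the attachments, a hypothetical minimal llama2.yaml along these lines should exercise the same code path (the model file name, context size, and template names are assumptions, not the reporter's actual values):

```yaml
# Hypothetical minimal LocalAI model config for reproduction.
# The GGUF file name and context size are placeholders.
name: llama2-chat-13b
parameters:
  model: llama-2-13b-chat.Q4_K_M.gguf   # placeholder file in the models folder
context_size: 4096
template:
  chat: llama2-chat             # resolves to llama2-chat.tmpl (see sketch above)
  completion: llama2-completion # resolves to llama2-completion.tmpl
```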
Expected behavior
Logs
Additional context