
Template failed loading: template: prompt:1:8: executing "prompt" at <.RoleName>: can't evaluate field RoleName in type model.PromptTemplateData #1151

Closed
yourtiger opened this issue Oct 9, 2023 · 5 comments

@yourtiger

LocalAI version:

quay.io/go-skynet/local-ai:latest

Environment, CPU architecture, OS, and Version:

Linux localhost.localdomain 3.10.0-1160.99.1.el7.x86_64 #1 SMP Wed Sep 13 14:19:20 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux

Describe the bug

I use Llama2-chat-13B and use Postman to ask questions. LocalAI reports an error:
[screenshot of the error]

To Reproduce

Postman request:
[screenshot]

models folder:
[screenshot]

llama2.yaml:
[screenshot]

llama2-chat.tmpl:
[screenshot]

llama2-completion.tmpl:
[screenshot]

Expected behavior

Logs

Additional context

yourtiger added the bug (Something isn't working) label on Oct 9, 2023
@localai-bot
Contributor

⚠️⚠️⚠️⚠️⚠️

Hi! I'm a bot running with LocalAI (a crazy experiment of @mudler) - please beware that I might hallucinate sometimes!

but.... I can also be funny or helpful 😸 and, generally speaking, I can provide good tips or point out places to look in the documentation or the code, based on what you wrote in the issue.

Don't engage in conversation with me,
I don't support (yet) replying!

⚠️⚠️⚠️⚠️⚠️

I'm sorry to hear that you are having issues with LocalAI. From the information provided, it seems like there might be an issue with the models or the configuration files.
To troubleshoot this further, I would recommend checking the following:

1. Ensure that the models directory is in the correct location and has the correct permissions.
2. Check the llama2.yaml and llama2-chat.tmpl files to ensure they are correctly formatted and contain the correct information.
3. Run LocalAI with the --debug flag to generate more detailed logs.
4. Check the OpenAI API key and host settings in the frontend configuration file to ensure they are correct.

If you are still having issues after checking these steps, please provide more details or log files so we can better assist you.


@yourtiger
Author

https://localai.io/advanced/
Now I know the reason: if system_prompt is used in the template, then system_prompt must be defined in the model's YAML file.
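
To make the fix concrete, here is a minimal sketch of how the two sides pair up. Everything except the system_prompt key is illustrative; the names, paths, and prompt text below are hypothetical, not taken from the screenshots above:

```yaml
# Hypothetical llama2.yaml sketch.
name: llama2-chat
parameters:
  model: llama-2-13b-chat.bin   # whichever model file sits in the models folder
template:
  chat: llama2-chat             # -> llama2-chat.tmpl, which references {{.SystemPrompt}}
  completion: llama2-completion
# The point of this thread: if a .tmpl file references {{.SystemPrompt}},
# this key must be defined here, otherwise template execution fails.
system_prompt: "You are a helpful assistant."
```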

@zenyanbo

> https://localai.io/advanced/ Now I know the reason: if system_prompt is used in the template, then system_prompt must be defined in the model's YAML file.

Hello, I also encountered this problem. It shows `executing "prompt" at <.RoleName>: can't evaluate field RoleName in type model.PromptTemplateData`. My template is:

<|im_start|>{{if eq .RoleName "assistant"}}assistant{{else if eq .RoleName "system"}}system{{else if eq .RoleName "user"}}user{{end}}
{{if .Content}}{{.Content}}{{end}}
<|im_end|>

How did you get it to work? Can you provide your configuration file for reference? Thanks.

@yourtiger
Author

I'm not sure whether LocalAI's templates support RoleName, or the format you wrote; the template parameters I relied on are the ones documented at https://localai.io/advanced/.
The problem I hit was that my template used the system_prompt variable, but system_prompt was not defined in the llama2.yaml file used at startup. So I only needed to add a system_prompt entry to llama2.yaml, as follows:
[screenshot]
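
The screenshot was not preserved; a plausible reconstruction of the added entry (the prompt text itself is hypothetical):

```yaml
system_prompt: "You are a helpful assistant."
```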

@zenyanbo

Oh, I see RoleName in your llama2.yaml config. So the error disappeared after you added system_prompt; does that mean LocalAI supports RoleName? I also saw a similar Jinja-style format in the examples, but it reported the same error as yours.
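
A note for readers landing here with the same error: the message names model.PromptTemplateData, the data struct passed to whole-prompt templates, and the error itself says RoleName is not a field of it; per-message fields like .RoleName only apply when the template is evaluated once per chat message. The sketch below is a guess, assuming the LocalAI version in use supports a per-message template key (template.chat_message); all names are hypothetical:

```yaml
# Hypothetical chatml.yaml sketch: wire the ChatML-style template quoted above
# as a per-message template, where .RoleName and .Content are available.
name: chatml-model
parameters:
  model: your-model.gguf       # hypothetical model file
template:
  chat_message: chatml-block   # -> chatml-block.tmpl containing the <|im_start|> template
```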
