
Error in function calling with ollama #70

Closed
ultronozm opened this issue Aug 25, 2024 · 1 comment · Fixed by #74
Comments

ultronozm (Contributor):

Evaluating

(let ((provider (make-llm-ollama :chat-model "mistral:latest")))
  (llm-tester-function-calling-conversation-sync provider))

yields

LLM request failed with code 400: Bad Request (additional information: ((error . json: cannot unmarshal array into Go struct field ChatRequest.messages of type string)))

I get the same error with "llama3.1-latest".

ahyatt (Owner) commented Aug 25, 2024:

Good bug, thanks for this. As far as I can tell, Ollama doesn't really have any documentation on function-calling conversations. I explored some examples and saw some differences from what I was doing, but even after fixing those differences I wasn't able to get it to work yet. Let me know the priority on your side, but this one is going to be difficult to fix.

ultronozm added a commit to ultronozm/llm that referenced this issue Aug 28, 2024
* llm-ollama.el (llm-provider-chat-request): Wrap contents via
json-encode, if not already a string.

Fixes ahyatt#70.
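The error message ("cannot unmarshal array into Go struct field ChatRequest.messages of type string") indicates that Ollama's chat endpoint expects each message's content field to be a string, while the library was sending structured data (e.g. a tool-call result) as a raw array. The commit's fix is to JSON-encode contents that aren't already strings. A minimal Python sketch of that idea (the helper name `normalize_content` and the sample payload are illustrative, not part of either codebase):

```python
import json

def normalize_content(content):
    """Return message content as a string.

    Structured data (lists/dicts, such as tool-call results) is
    serialized to JSON text; strings pass through unchanged. This
    mirrors the llm-ollama fix of wrapping non-string contents in
    json-encode before sending the chat request.
    """
    if isinstance(content, str):
        return content
    return json.dumps(content)

# A tool-call result arrives as structured data, not text:
result = [{"name": "get_weather", "result": {"temp_c": 21}}]

# Without normalization, "content" would be an array and Ollama's
# Go-side ChatRequest.messages unmarshalling would reject it.
message = {"role": "tool", "content": normalize_content(result)}
```

After normalization, `message["content"]` is a JSON string, which the server can unmarshal into its string-typed field; the original structure is still recoverable on the model side via the embedded JSON.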
ahyatt closed this as completed in 454f8c0 on Aug 29, 2024.