Unable to add Custom OpenAI provider (Unexpected colon in URL) #63
I think I've made some progress with the code by myself; I hope it actually helps rather than just confusing you even more.

The problem appears to stem from KoboldCPP using a different API endpoint for image description than standard OpenAI implementations. While the configuration works smoothly with OpenWebUI (which I had assumed used the same OpenAI endpoint and completions), KoboldCPP seems to diverge from this standard. At least this is what I think; I might be completely wrong, and I'm happy to be corrected.

When attempting to use the service/action, I receive an error indicating that the model doesn't exist. To investigate further, I added more debugging lines to the custom component, including code to fetch and log the available models. The component appears to parse the models correctly (detecting just one in my case), but curiously, no interaction was logged by KoboldCPP on its end.
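The model-listing debug step described above can be sketched as follows. This is a minimal illustration, not the integration's actual code: it assumes an OpenAI-compatible server exposing the standard `/v1/models` endpoint, and the `base_url` value is hypothetical.

```python
import json
from urllib.request import Request, urlopen


def list_model_ids(payload: dict) -> list:
    """Extract model ids from an OpenAI-style /v1/models response."""
    return [entry["id"] for entry in payload.get("data", [])]


def fetch_model_ids(base_url: str, api_key: str = "") -> list:
    """Fetch and parse the server's model list for debugging/logging."""
    req = Request(
        f"{base_url.rstrip('/')}/v1/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urlopen(req) as resp:  # e.g. a local KoboldCPP instance (hypothetical URL)
        return list_model_ids(json.load(resp))
```

If the id returned here doesn't match the model name sent in the completion request, the server would answer with a "model doesn't exist" error like the one described.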
Thanks for the detailed error report and the code! I'll look into this.
Your parsing is much cleaner than the mess it was before, so I'm using this for now. I'll release a beta so that you can test it. Any feedback is welcome!
I'm happy it worked! My brain suddenly remembered when I woke up that I hadn't included the filename for the code 😅 I've tested the beta and I managed to add my koboldcpp (thank you!), but it still errors when I try to use the service, saying the model doesn't exist. I'm sure it has something to do with the OpenAI endpoint used by koboldcpp itself. Unfortunately I'll be working the next two nights, so I'm not gonna be of much help. Anyway, thank you so much!
Thanks for testing! I will try this tomorrow with LocalAI just to be sure, as I know that its API implements the same endpoints as OpenAI.
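For reference, the request shape both servers would need to accept is the OpenAI vision-style chat completion body with an inline base64 image. A minimal sketch of building that payload (the model name here is a placeholder, not taken from the issue):

```python
import base64


def build_vision_request(model: str, prompt: str, image_bytes: bytes) -> dict:
    """Build an OpenAI-style chat completion body with an inline base64 image."""
    b64 = base64.b64encode(image_bytes).decode()
    return {
        "model": model,
        "messages": [
            {
                "role": "user",
                "content": [
                    {"type": "text", "text": prompt},
                    {
                        "type": "image_url",
                        "image_url": {"url": f"data:image/jpeg;base64,{b64}"},
                    },
                ],
            }
        ],
    }
```

A backend that implements `/v1/chat/completions` but rejects or ignores this multimodal `content` array would behave exactly as described: models list fine, but image requests fail.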
Closing this for now. Feel free to reopen if you need more help! |
Bug Description
When configuring a Custom OpenAI provider in Home Assistant's LLMVision integration, the system incorrectly parses the provided endpoint URL: it appends an extra colon (":") after the port number, which causes connection failures. This occurs even when the input URL is correctly formatted.
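A doubled colon typically appears when code appends `:{port}` to an input that already contains a port (e.g. `localhost:5001` becoming `localhost:5001:5001`). One way to avoid this, sketched here as an illustration rather than the integration's actual fix, is to split the URL into components and rebuild it:

```python
from urllib.parse import urlsplit


def normalize_base_url(raw: str) -> str:
    """Rebuild a base URL from its parts so a port can never be appended twice."""
    # urlsplit needs a scheme to recognise host:port correctly
    parts = urlsplit(raw if "://" in raw else f"http://{raw}")
    port = f":{parts.port}" if parts.port else ""
    return f"{parts.scheme}://{parts.hostname}{port}{parts.path.rstrip('/')}"
```

Because the port is taken from the parsed components rather than concatenated onto the raw string, an already-complete URL passes through unchanged.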
Version:
Attempted configuration
Logs
Additional context