Releases: ahyatt/llm
Open AI compat provider, names, and canceling
- Allow users to change the Open AI URL, to allow for proxies and other services that re-use the API.
- Add =llm-name= and =llm-cancel-request= to the API.
- Standardize handling of how context, examples, and history are folded into =llm-chat-prompt-interactions=.
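A minimal sketch of how the new calls might fit together, assuming an Open AI provider, that =llm-chat-async= returns a request object =llm-cancel-request= accepts, and that the prompt-constructor and provider-slot names below are correct (they are assumptions, not taken from these notes):

```elisp
;; Sketch only: names other than llm-name, llm-cancel-request, and
;; llm-chat-async are assumptions about the library's API.
(require 'llm)
(require 'llm-openai)

(let* ((provider (make-llm-openai :key "OPENAI-API-KEY"))
       (prompt (llm-make-simple-chat-prompt "Say hello"))
       ;; llm-chat-async is assumed to return a cancelable request object.
       (request (llm-chat-async provider prompt
                                (lambda (response)
                                  (message "Response: %s" response))
                                (lambda (_type err)
                                  (message "Error: %s" err)))))
  ;; llm-name gives a human-readable name for the provider.
  (message "Using provider: %s" (llm-name provider))
  ;; Abort the in-flight request if the answer is no longer wanted.
  (llm-cancel-request request))
```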
Upgrade to Google Gemini
- Upgrade Google Cloud Vertex to Gemini - previous models are no longer available.
- Added the =llm-gemini= provider, which uses an alternate endpoint with simpler authentication and setup than Cloud Vertex.
- Provide a default for =llm-chat-async= that falls back to streaming if a provider does not define it.
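As a sketch of the setup difference: Cloud Vertex requires a GCP project and gcloud authentication, while the new provider is assumed to need only an API key (the constructor slot name below is an assumption):

```elisp
;; Sketch: the exact constructor slot (:key) is an assumption.
(require 'llm-gemini)

;; Gemini setup: just an API key, no GCP project or gcloud login.
(setq my-llm-provider (make-llm-gemini :key "GEMINI-API-KEY"))

;; The provider then works through the generic llm API like any other,
;; e.g. with llm-chat or llm-chat-async.
```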
Adding llama-cpp, fixing Vertex & Emacs 28.1 incompatibility
- Add provider =llm-llamacpp=.
- Fix issue with Google Cloud Vertex not responding to messages with a system interaction.
- Fix use of =pos-eol=, which is not compatible with Emacs 28.1.
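A sketch of pointing the new provider at a locally running llama.cpp server (the constructor slots below are assumptions about =llm-llamacpp=, not taken from these notes):

```elisp
;; Sketch: :host and :port slot names are assumptions.
(require 'llm-llamacpp)

;; Connect to a llama.cpp server on its default local port.
(setq my-llm-provider (make-llm-llamacpp :host "localhost" :port 8080))
```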