
Releases: ahyatt/llm

Open AI compat provider, names, and canceling

30 Dec 02:04
  • Allow users to change the Open AI URL, so that proxies and other services that reuse the API can be used.
  • Add llm-name and llm-cancel-request to the API (see the sketch after this list).
  • Standardize how context, examples, and history are folded into llm-chat-prompt-interactions.
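
The Emacs Lisp sketch below shows how these additions might be used together. It is only an illustration: llm-name and llm-cancel-request are confirmed by this release, but the make-llm-openai-compatible constructor, its :key and :url slots, and llm-make-simple-chat-prompt are assumptions based on the package's conventions.

```elisp
;; Minimal sketch of the new API surface; slot names and
;; llm-make-simple-chat-prompt are assumptions, while llm-name and
;; llm-cancel-request are the additions named in this release.
(require 'llm)
(require 'llm-openai)

;; Point an Open AI-compatible provider at a proxy or another service
;; that reuses the Open AI API (assumes a :url slot).
(defvar my-provider
  (make-llm-openai-compatible :key "sk-..."
                              :url "https://my-proxy.example.com/v1/"))

;; llm-name returns a human-readable name for the provider.
(message "Chatting with %s" (llm-name my-provider))

;; llm-chat-async returns a request object that llm-cancel-request
;; can use to stop the call before it completes.
(let ((request (llm-chat-async
                my-provider
                (llm-make-simple-chat-prompt "Hello!")
                (lambda (response) (message "Response: %s" response))
                (lambda (_type msg) (message "Error: %s" msg)))))
  (llm-cancel-request request))
```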

Upgrade to Google Gemini

18 Dec 05:47
  • Upgrade Google Cloud Vertex to Gemini - previous models are no longer available.
  • Add the llm-gemini provider, an alternate endpoint with simpler authentication and setup than Cloud Vertex (see the sketch after this list).
  • Provide a default for llm-chat-async that falls back to the streaming implementation when a provider does not define it.
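
A minimal sketch of setting up the new Gemini provider follows; the :key slot name and llm-make-simple-chat-prompt are assumptions based on the package's other providers, not a definitive configuration.

```elisp
;; Minimal sketch of the new Gemini provider; the :key slot name is
;; an assumption based on the package's other providers.
(require 'llm)
(require 'llm-gemini)

(defvar my-gemini (make-llm-gemini :key "YOUR-GEMINI-API-KEY"))

;; llm-chat-async now works even for providers that only implement
;; streaming, since it falls back to the streaming implementation.
(llm-chat-async my-gemini
                (llm-make-simple-chat-prompt "Say hello from Gemini.")
                (lambda (response) (message "Response: %s" response))
                (lambda (_type msg) (message "Error: %s" msg)))
```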

Adding llama-cpp, fixing Vertex & Emacs 28.1 incompatibility

15 Dec 01:36
  • Add the llm-llamacpp provider (see the sketch after this list).
  • Fix an issue with Google Cloud Vertex not responding to messages that include a system interaction.
  • Fix the use of (pos-eol), which is not compatible with Emacs 28.1.
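
A minimal sketch of pointing the new provider at a local llama.cpp server; the :host and :port slot names and llm-make-simple-chat-prompt are assumptions, and the defaults may already cover a local setup.

```elisp
;; Minimal sketch of the new llama.cpp provider; the :host and :port
;; slot names are assumptions, and the defaults may already target a
;; local llama.cpp server.
(require 'llm)
(require 'llm-llamacpp)

(defvar my-llamacpp (make-llm-llamacpp :host "localhost" :port 8080))

;; Synchronous chat call; returns the model's response as a string.
(llm-chat my-llamacpp
          (llm-make-simple-chat-prompt "Hello from llama.cpp!"))
```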