Releases · ditrit/leto-modelizer-ai-proxy
Release version 1.0.0
[1.0.0] - 2024/10/15
Added
- Set up the project
- Create an Ollama handler to generate code through the /api/diagram route (an example call is sketched after this list)
- Handle Modelfiles for Ollama (Terraform, Kubernetes and GitHub Actions)
- Migrate to Python 3.12
- Migrate from Uvicorn to Hypercorn (a serving sketch follows the list)
- Add new Ollama model files (for generate and message)
- Handle the new /api/message endpoint for Ollama, which sends messages to the Ollama AI and returns the response with its associated context (an example round trip is sketched after this list)
- Add new Gemini model files (for generate and message); conversations with a context are not supported
- Add a Docker Compose file (works only with NVIDIA GPUs)
- Separate the initialization script from the main script; it can now be launched from the root folder at any time
- Add health endpoint
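A minimal sketch of what a client call to the /api/diagram route could look like, assuming the proxy runs locally on port 8000 and accepts a JSON body. The field names `plugin` and `description` are illustrative assumptions, not the project's documented schema.

```python
# Hypothetical client call to the /api/diagram route served by the Ollama handler.
# The base URL and the payload fields ("plugin", "description") are assumptions.
import requests

BASE_URL = "http://localhost:8000"  # assumed local address of the proxy

response = requests.post(
    f"{BASE_URL}/api/diagram",
    json={
        "plugin": "terraform",  # assumed: which Modelfile/plugin to target
        "description": "An S3 bucket with versioning enabled",  # assumed: user prompt
    },
    timeout=120,
)
response.raise_for_status()
print(response.json())  # generated diagram code returned by the Ollama backend
```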
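In the same spirit, a hedged sketch of the /api/message round trip: send a message, read back the response together with the context returned by Ollama, then pass that context with the next message to continue the conversation. The `message` and `context` field names are assumptions for illustration only.

```python
# Hypothetical two-step conversation against the /api/message endpoint.
import requests

BASE_URL = "http://localhost:8000"  # assumed local address of the proxy

first = requests.post(
    f"{BASE_URL}/api/message",
    json={"message": "Explain the generated Terraform file"},  # assumed field name
    timeout=120,
)
first.raise_for_status()
context = first.json().get("context")  # assumed: conversation context from Ollama

follow_up = requests.post(
    f"{BASE_URL}/api/message",
    json={"message": "Now add an output block", "context": context},
    timeout=120,
)
follow_up.raise_for_status()
print(follow_up.json())
```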
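For the Uvicorn to Hypercorn migration, the sketch below shows one way to serve an ASGI (FastAPI) application with Hypercorn's Python API. The app, the bind address and the /health path are assumptions; the project may simply launch the server through the `hypercorn` command line instead.

```python
# Minimal sketch: serving a FastAPI app with Hypercorn instead of Uvicorn.
import asyncio

from fastapi import FastAPI
from hypercorn.asyncio import serve
from hypercorn.config import Config

app = FastAPI()


@app.get("/health")  # assumed path for the health endpoint
async def health() -> dict:
    return {"status": "ok"}


if __name__ == "__main__":
    config = Config()
    config.bind = ["0.0.0.0:8000"]  # assumed bind address
    asyncio.run(serve(app, config))
```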