This repo contains the code for running the LLM OS in two environments:

- **dev**: A development environment running locally on docker
- **prd**: A production environment running on AWS ECS
- Clone the git repo; run the following commands from the `llm-os` dir:
- Create + activate a virtual env:

```shell
python3 -m venv aienv
source aienv/bin/activate
```
- Install `phidata`:

```shell
pip install 'phidata[aws]'
```
- Setup the workspace:

```shell
phi ws setup
```
- Copy `workspace/example_secrets` to `workspace/secrets`:

```shell
cp -r workspace/example_secrets workspace/secrets
```
- Optional: Create a `.env` file:

```shell
cp example.env .env
```
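If you create a `.env` file, it can hold the same keys as the export commands in the credentials step below — a minimal sketch (the values shown are placeholders, not real keys):

```shell
# .env — read by the workspace in place of exported shell variables
OPENAI_API_KEY=sk-***
EXA_API_KEY=xxx
```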
- Install Docker Desktop
- Export credentials. We use gpt-4o as the LLM, so export your OpenAI API key:

```shell
export OPENAI_API_KEY=sk-***
```

- To use Exa for research, export your `EXA_API_KEY` (get it from here):

```shell
export EXA_API_KEY=xxx
```

  Alternatively, set them in the `.env` file.
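Before starting the workspace, you can sanity-check that the keys are actually set — a small bash sketch (the loop and warning message are illustrative, not part of phidata; `EXA_API_KEY` only matters if you use Exa):

```shell
# Warn about any required API key that is missing from the environment
for var in OPENAI_API_KEY EXA_API_KEY; do
  if [ -z "${!var}" ]; then
    echo "warning: $var is not set"
  fi
done
```

No output means both keys are exported.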
- Start the workspace using:

```shell
phi ws up
```
- Open localhost:8501 to view the Streamlit App.
- If FastApi is enabled, open localhost:8000/docs to view the FastApi docs.
- Stop the workspace using:

```shell
phi ws down
```