This repo contains the code for running an LLM App in 2 environments:
- `dev`: A development environment running locally on docker
- `prd`: A production environment running on AWS ECS
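The steps below run the `dev` environment locally on docker. A minimal, hedged way to see which environments and options the workspace command accepts (the exact flags depend on your phidata version) is the CLI help:

```sh
phi ws up --help
```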
- Clone the git repo
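A minimal sketch of this step; the clone URL below is a placeholder for this repo's actual URL:

```sh
# Placeholder URL: replace with the actual clone URL of this repo
git clone <repo-url> llm-app
cd llm-app
```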
From the `llm-app` dir:
- Create + activate a virtual env: `python3 -m venv aienv`, then `source aienv/bin/activate`
- Install `phidata`: `pip install phidata`
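To confirm the install worked, a quick check (plain pip plus the CLI's built-in help, no other assumptions):

```sh
pip show phidata   # confirm the package and its version
phi --help         # confirm the phi CLI is on your PATH
```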
- Set up the workspace: `phi ws setup`
- Copy `workspace/example_secrets` to `workspace/secrets`: `cp -r workspace/example_secrets workspace/secrets`
- Optional: Create a `.env` file: `cp example.env .env`
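A minimal sketch of what the `.env` file might hold, assuming the OpenAI key (set in a later step) is the only variable you need; the value shown is a placeholder:

```sh
# .env (placeholder value, replace with your real key)
OPENAI_API_KEY=sk-***
```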
- Install Docker Desktop
- Set your OpenAI API key: set the `OPENAI_API_KEY` environment variable using `export OPENAI_API_KEY=sk-***`, or set it in the `.env` file
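A quick, hedged way to confirm the key is visible to your shell before starting the workspace (prints only the first few characters, or nothing if it is unset):

```sh
printenv OPENAI_API_KEY | cut -c1-6
```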
- Start the workspace using `phi ws up`
- Open localhost:8501 to view the Streamlit App.
- Open localhost:8000/docs to view the FastAPI docs.
- If Jupyter is enabled, open localhost:8888 to view JupyterLab UI.
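Once the workspace is up, a hedged way to confirm the containers started and the endpoints above respond (container names depend on your workspace config; expect HTTP 200 once the apps finish starting):

```sh
docker ps   # list the running workspace containers
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8501
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8000/docs
```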
- Stop the workspace using `phi ws down`
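To confirm everything stopped, a quick check assuming the workspace ran on your local docker:

```sh
docker ps   # the workspace containers should no longer be listed
```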