Streaming OpenAI API responses with Server-Sent Events and Golang

A simple HTTP proxy that lets you stream responses from OpenAI's API. It can be used to keep your API key off the client and to add extra functionality on top of your API calls.
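Under the hood, a proxy like this accepts a chat request, forwards it to OpenAI's streaming chat completions endpoint with the API key attached server-side, and re-emits each token to the caller as a Server-Sent Event. Below is a minimal sketch of such a handler; the /message route, the OA_API_KEY variable, and the event payload match the usage shown later, but the model name and internal structure are assumptions, so the real main.go may differ:

```go
package main

import (
	"bufio"
	"bytes"
	"encoding/json"
	"fmt"
	"log"
	"net/http"
	"os"
	"strings"
	"time"
)

func main() {
	http.HandleFunc("/message", handleMessage)
	log.Fatal(http.ListenAndServe(":8080", nil))
}

func handleMessage(w http.ResponseWriter, r *http.Request) {
	var body struct {
		Messages []map[string]string `json:"messages"`
	}
	if err := json.NewDecoder(r.Body).Decode(&body); err != nil {
		http.Error(w, err.Error(), http.StatusBadRequest)
		return
	}

	// Forward the messages to OpenAI's streaming chat completions endpoint.
	// The model name here is an assumption.
	payload, _ := json.Marshal(map[string]any{
		"model":    "gpt-3.5-turbo",
		"stream":   true,
		"messages": body.Messages,
	})
	req, _ := http.NewRequestWithContext(r.Context(), http.MethodPost,
		"https://api.openai.com/v1/chat/completions", bytes.NewReader(payload))
	req.Header.Set("Authorization", "Bearer "+os.Getenv("OA_API_KEY"))
	req.Header.Set("Content-Type", "application/json")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		http.Error(w, err.Error(), http.StatusBadGateway)
		return
	}
	defer resp.Body.Close()

	// Re-emit each token to the caller as a Server-Sent Event.
	w.Header().Set("Content-Type", "text/event-stream")
	w.Header().Set("Cache-Control", "no-cache")
	flusher, _ := w.(http.Flusher)

	// OpenAI streams lines of the form "data: {...}", ending with "data: [DONE]".
	scanner := bufio.NewScanner(resp.Body)
	for scanner.Scan() {
		line := scanner.Text()
		if !strings.HasPrefix(line, "data: ") {
			continue
		}
		data := strings.TrimPrefix(line, "data: ")
		if data == "[DONE]" {
			break
		}
		var chunk struct {
			Choices []struct {
				Delta struct {
					Content string `json:"content"`
				} `json:"delta"`
			} `json:"choices"`
		}
		if err := json.Unmarshal([]byte(data), &chunk); err != nil || len(chunk.Choices) == 0 {
			continue
		}
		out, _ := json.Marshal(map[string]any{
			"timestamp": time.Now().Unix(),
			"content":   chunk.Choices[0].Delta.Content,
		})
		fmt.Fprintf(w, "event:message\ndata:%s\n\n", out)
		if flusher != nil {
			flusher.Flush()
		}
	}
}
```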
# Set your API key
$ export OA_API_KEY=sk-...
# Run the proxy
$ go run main.go
# Build the image
$ docker build -t openai-proxy .
# Run the image
$ docker run -p 8080:8080 -e OA_API_KEY=sk-... openai-proxy
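The repository's actual Dockerfile isn't reproduced here; a typical multi-stage build for a Go service like this might look as follows (a sketch, with the Go version and base image as assumptions):

```dockerfile
# Build the binary in a full Go image, then copy it into a minimal runtime image.
FROM golang:1.21 AS build
WORKDIR /src
COPY . .
RUN CGO_ENABLED=0 go build -o /openai-proxy .

FROM gcr.io/distroless/static-debian12
COPY --from=build /openai-proxy /openai-proxy
EXPOSE 8080
ENTRYPOINT ["/openai-proxy"]
```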
# Send a chat message and stream the response
$ curl -s -N -X POST -d '{"messages": [{"role": "user", "content": "Hello world!"}]}' http://localhost:8080/message
event:message
data:{"timestamp":1702330536,"content":""}

event:message
data:{"timestamp":1702330536,"content":"Hello"}

event:message
data:{"timestamp":1702330536,"content":"!"}

event:message
data:{"timestamp":1702330536,"content":" How"}

event:message
data:{"timestamp":1702330536,"content":" can"}

event:message
data:{"timestamp":1702330536,"content":" I"}

event:message
data:{"timestamp":1702330536,"content":" assist"}

event:message
data:{"timestamp":1702330536,"content":" you"}

event:message
data:{"timestamp":1702330536,"content":" today"}

event:message
data:{"timestamp":1702330536,"content":"?"}

event:message
data:{"timestamp":1702330536,"content":""}
This project has a simple demo UI which can be used to test the proxy. The UI is built with React and can be found in the ui/chat-demo directory.
$ cd ui/chat-demo && yarn install && yarn start
...
Compiled successfully!

You can now view chat-demo in the browser.

  Local:            http://localhost:3000
  On Your Network:  http://192.168.0.238:3000

Note that the development build is not optimized.
To create a production build, use yarn build.

webpack compiled successfully