Update SearchQnA document and compose.yaml (#774)
Signed-off-by: Wang, Xigui <[email protected]>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
xiguiw and pre-commit-ci[bot] committed Sep 11, 2024
1 parent d2bab99 commit 5c67204
Showing 2 changed files with 16 additions and 4 deletions.
18 changes: 15 additions & 3 deletions SearchQnA/docker_compose/intel/cpu/xeon/README.md
@@ -46,13 +46,24 @@ docker build --no-cache -t opea/searchqna:latest --build-arg https_proxy=$https_
cd ../../..
```

### 7. Build UI Docker Image

Build the frontend Docker image with the command below:

```bash
cd GenAIExamples/SearchQnA/ui
docker build --no-cache -t opea/searchqna-ui:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f ./docker/Dockerfile .
cd ../../../..
```

Then run the command `docker images`; you should see the following images ready (a filtered check is sketched after this list):

1. `opea/embedding-tei:latest`
2. `opea/web-retriever-chroma:latest`
3. `opea/reranking-tei:latest`
4. `opea/llm-tgi:latest`
5. `opea/searchqna:latest`
6. `opea/searchqna-ui:latest`
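
To confirm all six images are present, a minimal filtered listing is sketched below; the regular expression simply mirrors the names in the list above and assumes the default `opea` registry prefix:

```bash
# Show only the SearchQnA-related images; expect six matching lines.
docker images | grep -E 'opea/(embedding-tei|web-retriever-chroma|reranking-tei|llm-tgi|searchqna|searchqna-ui)'
```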

## 🚀 Set the environment variables

@@ -65,11 +76,12 @@ export GOOGLE_API_KEY=<your google api key>
export HUGGINGFACEHUB_API_TOKEN=<your HF token>

export EMBEDDING_MODEL_ID=BAAI/bge-base-en-v1.5
-export TEI_EMBEDDING_ENDPOINT=http://$host_ip:3001
+export TEI_EMBEDDING_ENDPOINT=http://${host_ip}:3001
export RERANK_MODEL_ID=BAAI/bge-reranker-base
-export TEI_RERANKING_ENDPOINT=http://$host_ip:3004
+export TEI_RERANKING_ENDPOINT=http://${host_ip}:3004
+export BACKEND_SERVICE_ENDPOINT=http://${host_ip}:3008/v1/searchqna

-export TGI_LLM_ENDPOINT=http://$host_ip:3006
+export TGI_LLM_ENDPOINT=http://${host_ip}:3006
export LLM_MODEL_ID=Intel/neural-chat-7b-v3-3

export MEGA_SERVICE_HOST_IP=${host_ip}
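
The endpoint variables above interpolate `${host_ip}`, so `host_ip` must be exported before the block is run. A minimal sketch is shown below; the way `host_ip` is derived is an assumption (it is not part of this diff) and should point at the externally reachable IP of the Xeon host:

```bash
# Assumption: use the first IP reported by the host; adjust if the machine has multiple interfaces.
export host_ip=$(hostname -I | awk '{print $1}')

# After exporting the variables above, sanity-check that the endpoints expanded as intended.
echo "$TEI_EMBEDDING_ENDPOINT" "$TEI_RERANKING_ENDPOINT" "$TGI_LLM_ENDPOINT" "$BACKEND_SERVICE_ENDPOINT"
```
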
2 changes: 1 addition & 1 deletion SearchQnA/docker_compose/intel/cpu/xeon/compose.yaml
@@ -129,7 +129,7 @@ services:
      - LLM_SERVICE_PORT=${LLM_SERVICE_PORT}
    ipc: host
    restart: always
-  searchqna-gaudi-ui-server:
+  searchqna-xeon-ui-server:
    image: ${REGISTRY:-opea}/searchqna-ui:${TAG:-latest}
    container_name: searchqna-xeon-ui-server
    depends_on:
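
With the UI service renamed to `searchqna-xeon-ui-server`, a minimal bring-up check is sketched below; it assumes the environment variables above are already exported and uses the compose file path shown in this diff:

```bash
# Start the Xeon deployment defined in the compose file changed by this commit.
cd GenAIExamples/SearchQnA/docker_compose/intel/cpu/xeon
docker compose -f compose.yaml up -d

# Confirm the renamed UI container is running.
docker ps --filter name=searchqna-xeon-ui-server
```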
