# Fix issues with the VisualQnA instructions (#809)
Signed-off-by: Dina Suehiro Jones <[email protected]>
Signed-off-by: dmsuehir <[email protected]>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Authored by dmsuehir and pre-commit-ci[bot] on Sep 14, 2024 · 1 parent edcc50f · commit bc4bbfa
Showing 3 changed files with 23 additions and 13 deletions.
## `VisualQnA/Dockerfile` (1 addition, 1 deletion)
```diff
@@ -23,7 +23,7 @@ RUN pip install --no-cache-dir --upgrade pip && \
 
 COPY ./visualqna.py /home/user/visualqna.py
 
-ENV PYTHONPATH=$PYTHONPATH:/home/user/GenAIComps
+ENV PYTHONPATH=/home/user/GenAIComps
 
 USER user
```
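The `ENV PYTHONPATH` change above stops appending to a variable that is unset in the base image. A minimal sketch of the difference (plain shell, no Docker required):

```shell
# When PYTHONPATH is unset, appending produces a leading ":". Python
# treats the resulting empty entry as "also search the current directory",
# which is rarely what a container image wants.
unset PYTHONPATH
appended="$PYTHONPATH:/home/user/GenAIComps"  # expands to ":/home/user/GenAIComps"
direct="/home/user/GenAIComps"                # what the fixed Dockerfile sets

echo "appended: $appended"
echo "direct:   $direct"
```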
## `VisualQnA/docker_compose/intel/cpu/xeon/README.md` (10 additions, 5 deletions)
````diff
@@ -41,10 +41,12 @@ git clone https://github.com/opea-project/GenAIComps.git
 cd GenAIComps
 ```
 
-### 1. Build LVM Image
+### 1. Build LVM and NGINX Docker Images
 
 ```bash
 docker build --no-cache -t opea/lvm-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/lvms/tgi-llava/Dockerfile .
+
+docker build --no-cache -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/nginx/Dockerfile .
 ```
 
 ### 2. Build MegaService Docker Image
@@ -55,7 +57,7 @@ To construct the Mega Service, we utilize the [GenAIComps](https://github.com/op
 git clone https://github.com/opea-project/GenAIExamples.git
 cd GenAIExamples/VisualQnA
 docker build --no-cache -t opea/visualqna:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f Dockerfile .
-cd ../../..
+cd ../..
 ```
 
 ### 3. Build UI Docker Image
@@ -65,7 +67,7 @@ Build frontend Docker image via below command:
 ```bash
 cd GenAIExamples/VisualQnA/ui
 docker build --no-cache -t opea/visualqna-ui:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f docker/Dockerfile .
-cd ../../../..
+cd ../../..
 ```
 
 ### 4. Pull TGI Xeon Image
@@ -74,12 +76,13 @@ cd ../../../..
 docker pull ghcr.io/huggingface/text-generation-inference:sha-e4201f4-intel-cpu
 ```
 
-Then run the command `docker images`, you will have the following 4 Docker Images:
+Then run the command `docker images`, you will have the following 5 Docker Images:
 
 1. `ghcr.io/huggingface/text-generation-inference:sha-e4201f4-intel-cpu`
 2. `opea/lvm-tgi:latest`
 3. `opea/visualqna:latest`
 4. `opea/visualqna-ui:latest`
+5. `opea/nginx`
 
 ## 🚀 Start Microservices
````
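As an editor's aside (not part of the commit), a quick way to confirm that the expected images from the list above are present locally. The helper takes the listing as plain text, so only the commented `docker images` capture assumes Docker is installed:

```shell
# check_images: succeed only if every expected image name appears in the
# listing (one repository name per line).
check_images() {
  listing=$1
  shift
  for img in "$@"; do
    printf '%s\n' "$listing" | grep -Fqx "$img" || { echo "missing: $img"; return 1; }
  done
  echo "all expected images present"
}

# In a real setup the listing would come from Docker, e.g.:
#   listing="$(docker images --format '{{.Repository}}')"
# Here a sample listing stands in for that output.
listing="ghcr.io/huggingface/text-generation-inference
opea/lvm-tgi
opea/visualqna
opea/visualqna-ui
opea/nginx"

check_images "$listing" opea/lvm-tgi opea/visualqna opea/visualqna-ui opea/nginx
# prints: all expected images present
```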

````diff
@@ -98,7 +101,7 @@ export host_ip="External_Public_IP"
 **Append the value of the public IP address to the no_proxy list**
 
 ```
-export your_no_proxy=${your_no_proxy},"External_Public_IP"
+export your_no_proxy="${your_no_proxy},${host_ip}"
 ```
````
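The corrected export quotes the whole expansion and reuses `${host_ip}` instead of repeating a literal placeholder. A sketch with hypothetical values showing the resulting list:

```shell
# Hypothetical values for illustration only.
host_ip="203.0.113.7"
your_no_proxy="localhost,127.0.0.1"

# Same form as the fixed README line: quote the whole expansion so the
# comma-joined list is assigned as a single value.
your_no_proxy="${your_no_proxy},${host_ip}"

echo "$your_no_proxy"  # prints localhost,127.0.0.1,203.0.113.7
```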

````diff
@@ -131,6 +134,8 @@ docker compose -f compose.yaml up -d
 
 Follow the instructions to validate MicroServices.
 
+> Note: If you see an "Internal Server Error" from the `curl` command, wait a few minutes for the microserver to be ready and then try again.
+
 1. LLM Microservice
 
 ```bash
````
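The note added by the commit suggests waiting and retrying. As a sketch of that advice (not part of the commit), a small POSIX-shell retry helper; `flaky` below is a stand-in for the real `curl` health check:

```shell
# retry ATTEMPTS DELAY CMD...: rerun CMD until it succeeds or the attempt
# budget runs out; sleeps DELAY seconds between tries.
retry() {
  attempts=$1
  delay=$2
  shift 2
  i=1
  until "$@"; do
    [ "$i" -ge "$attempts" ] && return 1
    i=$((i + 1))
    sleep "$delay"
  done
}

# Stand-in for a curl health check: fails twice, then succeeds.
count=0
flaky() {
  count=$((count + 1))
  [ "$count" -ge 3 ]
}

retry 5 0 flaky && echo "service ready after $count attempts"
# prints: service ready after 3 attempts
```

In real use the command would be something like `retry 30 10 curl -sf <microservice URL>`, substituting the endpoint from your deployment.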
## `VisualQnA/docker_compose/intel/hpu/gaudi/README.md` (12 additions, 7 deletions)
````diff
@@ -13,10 +13,12 @@ git clone https://github.com/opea-project/GenAIComps.git
 cd GenAIComps
 ```
 
-### 2. Build LLM Image
+### 2. Build LVM and NGINX Docker Images
 
 ```bash
 docker build --no-cache -t opea/lvm-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/lvms/tgi-llava/Dockerfile .
+
+docker build --no-cache -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/nginx/Dockerfile .
 ```
 
 ### 3. Pull TGI Gaudi Image
@@ -31,27 +33,28 @@ To construct the Mega Service, we utilize the [GenAIComps](https://github.com/op
 
 ```bash
 git clone https://github.com/opea-project/GenAIExamples.git
-cd GenAIExamples/VisualQnA/docker
+cd GenAIExamples/VisualQnA
 docker build --no-cache -t opea/visualqna:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f Dockerfile .
-cd ../../..
+cd ../..
 ```
 
 ### 5. Build UI Docker Image
 
 Build frontend Docker image via below command:
 
 ```bash
-cd GenAIExamples/VisualQnA//
+cd GenAIExamples/VisualQnA/ui
 docker build --no-cache -t opea/visualqna-ui:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f ./docker/Dockerfile .
-cd ../../../..
+cd ../../..
 ```
 
-Then run the command `docker images`, you will have the following 4 Docker Images:
+Then run the command `docker images`, you will have the following 5 Docker Images:
 
-1. `opea/llava-tgi:latest`
+1. `ghcr.io/huggingface/tgi-gaudi:2.0.4`
 2. `opea/lvm-tgi:latest`
 3. `opea/visualqna:latest`
 4. `opea/visualqna-ui:latest`
+5. `opea/nginx`
 
 ## 🚀 Start MicroServices and MegaService
````

````diff
@@ -89,6 +92,8 @@ docker compose -f compose.yaml up -d
 
 Follow the instructions to validate MicroServices.
 
+> Note: If you see an "Internal Server Error" from the `curl` command, wait a few minutes for the microserver to be ready and then try again.
+
 1. LLM Microservice
 
 ```bash
````
