
Failed to display Chinese response with llm-tgi and react UI #589

Closed
bjzhjing opened this issue Sep 2, 2024 · 1 comment
Labels
DEV features

Comments

bjzhjing (Contributor) commented Sep 2, 2024

Installed the OPEA ChatQnA example with Helm, using llm-tgi and the React UI. When asking a question in Chinese, the response is garbled. I verified that TGI itself responds correctly, but the llm-tgi microservice mangles the text.
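For illustration only, here is a minimal, hypothetical Python sketch of one common way correctly encoded UTF-8 output turns into mojibake downstream: decoding a streamed response chunk by chunk instead of with an incremental decoder. This is not the actual llm-tgi code path and not the eventual fix; it only shows why Chinese (multi-byte) text is especially prone to this kind of corruption.

```python
# Hypothetical reproduction: Chinese characters are 3 bytes each in UTF-8,
# so a chunk boundary can split a character in half.
import codecs

text = "你好，世界"              # sample Chinese answer, as TGI would return it
payload = text.encode("utf-8")    # 15 bytes: every character spans 3 bytes

# Naive per-chunk decoding breaks when a boundary falls inside a character.
chunks = [payload[:4], payload[4:]]   # boundary splits '好' across two chunks
broken = "".join(c.decode("utf-8", errors="replace") for c in chunks)
print(broken)   # mojibake: '好' becomes U+FFFD replacement characters

# An incremental decoder buffers the partial byte sequence across chunks.
decoder = codecs.getincrementaldecoder("utf-8")()
intact = "".join(decoder.decode(c) for c in chunks) + decoder.decode(b"", final=True)
print(intact)   # 你好，世界
```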

bjzhjing (Contributor, Author) commented Sep 3, 2024

Closing this one, since the issue is addressed by opea-project/GenAIExamples#713.

@bjzhjing bjzhjing closed this as completed Sep 3, 2024
lkk12014402 pushed a commit that referenced this issue Sep 19, 2024