
About using this model in LM Studio… #51

Open

zhihmeng opened this issue Jan 3, 2024 · 1 comment

Comments


zhihmeng commented Jan 3, 2024

Hi,

May I ask how LM Studio should be configured so that this model's output matches what the demo site twllm.com produces?

I'm currently deploying on an M3 Max machine with 48 GB of RAM, using the audreyt/Taiwan-LLM-13B-v2.0-chat-GGUF build of the model.

I'm running into the same problem: the answers are much less detailed than those from twllm.com, and I'm not sure whether this comes down to a difference in settings.

Could you help clarify? Thank you!
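For anyone comparing settings, a minimal llama-cpp-python sketch that makes the prompt and sampling parameters explicit looks roughly like the following. The local file name, system prompt, and sampling values are placeholders I chose for illustration, not the actual twllm.com settings:

```python
# Minimal sketch: load the GGUF with llama-cpp-python and spell out the
# sampling settings, so the same values can be mirrored in LM Studio's
# model configuration panel for a fair comparison.
from llama_cpp import Llama

llm = Llama(
    model_path="./Taiwan-LLM-13B-v2.0-chat.Q4_K_M.gguf",  # placeholder local quant file
    n_ctx=4096,        # context window
    n_gpu_layers=-1,   # offload all layers to Metal on the M3 Max
)

# Assumed system prompt and chat turns; the model card's template should
# take precedence if it differs.
messages = [
    {"role": "system", "content": "你是一個樂於助人的繁體中文 AI 助理，請詳細回答問題。"},
    {"role": "user", "content": "請介紹台灣的夜市文化。"},
]

out = llm.create_chat_completion(
    messages=messages,
    temperature=0.7,   # guessed values; the demo site's settings are not public
    top_p=0.9,
    max_tokens=1024,
)
print(out["choices"][0]["message"]["content"])
```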

@adamlin120
Collaborator

It may be an issue with the choice of calibration data during quantization. For future models, we will try to do the quantization properly and release the quantized versions together with the main release.
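As a rough illustration of what calibration data means here: before quantizing, a small representative text set is collected and fed to llama.cpp's importance-matrix (imatrix) tooling so the quantizer knows which activations matter most for the target language. The sketch below only shows assembling such a file; the sample sentences, file names, and commands in the comments are placeholders, not the data or exact workflow used for the released GGUF files:

```python
# Rough sketch: assemble a Traditional Chinese calibration text file for
# importance-matrix quantization. Samples and file names are placeholders.
from pathlib import Path

samples = [
    "請介紹台灣的夜市文化，並舉出三個著名的夜市。",
    "說明全民健保制度的基本運作方式。",
    "用繁體中文寫一封請假的電子郵件。",
]

calib_path = Path("calibration_zh_tw.txt")
calib_path.write_text("\n\n".join(samples), encoding="utf-8")

# This file would then be passed to llama.cpp's imatrix tool to collect
# activation statistics before quantizing, roughly (exact flags depend on
# the llama.cpp version):
#   ./imatrix -m model-f16.gguf -f calibration_zh_tw.txt -o imatrix.dat
#   ./quantize --imatrix imatrix.dat model-f16.gguf model-Q4_K_M.gguf Q4_K_M
print(f"Wrote {len(samples)} calibration samples to {calib_path}")
```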
