feat: add llama hugging face apis #7
Conversation
```python
class ChatCompletion:
class Choice:
```
Enclosing classes make more sense here.
LGTM, what do you think @MaximeThoonsen?
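As a sketch of what the enclosing-class suggestion could look like here (the field names are assumptions for illustration, not taken from the PR):

```python
from dataclasses import dataclass, field

@dataclass
class ChatCompletion:
    # Choice is nested inside ChatCompletion, as the review suggests,
    # so the two types are grouped under one namespace.
    @dataclass
    class Choice:
        index: int
        text: str

    id: str
    choices: list["ChatCompletion.Choice"] = field(default_factory=list)
```

The nested class is then referenced as `ChatCompletion.Choice`, which keeps related response types together without polluting the module namespace.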
```yaml
cache: "pip"
- name: Install dependencies
  run: |
    python -m pip install --upgrade pip
```
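For context, the fragment above appears to come from a GitHub Actions workflow using `actions/setup-python` with pip caching enabled. A minimal sketch of such a job (step names, action versions, and the requirements file are assumptions, not taken from the PR):

```yaml
# Hypothetical workflow sketch; versions and file names are assumptions.
name: CI
on: [push, pull_request]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
          cache: "pip"  # caches pip downloads between runs
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements.txt
```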
Thanks for that!
````diff
@@ -72,7 +72,7 @@ cd ../../gpt4all-bindings/python
 pip3 install -e .
 ```
-7. Download it to your local machine from [here](https://gpt4all.io/models/ggml-gpt4all-j-v1.3-groovy.bin) and put it in the `genoss/model` directory as `genoss/model/ggml-gpt4all-j-v1.3-groovy.bin`
+7. Download it to your local machine from [here](https://gpt4all.io/models/ggml-gpt4all-j-v1.3-groovy.bin) and put it in the `local_models` directory as `local_models/ggml-gpt4all-j-v1.3-groovy.bin`
````
I agree
`genoss/api/embeddings_routes.py` (Outdated)
```diff
@@ -1,6 +1,6 @@
 from ast import List
 from fastapi import APIRouter
-from genoss.model.gpt4all_llm import Gpt4AllLLM
+from genoss.model.llm.local.gpt4all import Gpt4AllLLM
```
Agree, it's clearer.
```python
llm_chain = LLMChain(llm=llm, prompt=prompt_template)
response_text = llm_chain(question)

print("###################")
```
Use a logger instead of that.
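A minimal sketch of the reviewer's suggestion, swapping the `print` call for a module-level logger (the chain object and function name here are stand-ins for illustration, not Genoss code):

```python
import logging

logger = logging.getLogger(__name__)

def run_chain(llm_chain, question):
    """Run the chain and log the response instead of printing it."""
    response_text = llm_chain(question)
    # logging lets callers control verbosity and output destination,
    # unlike a hard-coded print
    logger.debug("Chain response: %s", response_text)
    return response_text
```

With `logging.basicConfig(level=logging.DEBUG)` in the application entry point, the same message appears on stderr, but it can be silenced or redirected without touching this code.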
```python
response_text = llm_chain(last_messages)

llm_chain = LLMChain(llm=llm, prompt=prompt_template)
response_text = llm_chain(question)
print("###################")
```
No description provided.