feat(glm3): adapt to glm3 prompt #2620
Conversation
Thanks @silk55 for the contribution! We'd love to add support for ChatGLM3.
LGTM.
Oh, I think there should be a \n after the system message.
hi there, check this: #2622. I think I follow the format there.
@@ -140,6 +140,17 @@ def get_prompt(self) -> str:
elif self.sep_style == SeparatorStyle.CHATGLM:
    # source: https://huggingface.co/THUDM/chatglm-6b/blob/1d240ba371910e9282298d4592532d7f0f3e9f3e/modeling_chatglm.py#L1302-L1308
    # source2: https://huggingface.co/THUDM/chatglm2-6b/blob/e186c891cf64310ac66ef10a87e6635fa6c2a579/modeling_chatglm.py#L926
    if self.name == "chatglm3":
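To make the discussion concrete, here is a minimal sketch of what a ChatGLM3-style prompt builder could look like. It assumes the `<|system|>` / `<|user|>` / `<|assistant|>` role tags used by the chatglm3-6b tokenizer; the function name and exact whitespace handling here are illustrative assumptions, not the PR's actual implementation.

```python
# Sketch of a ChatGLM3-style prompt builder (assumed format: each role
# tag is followed by "\n" and the message content; a trailing role tag
# with no content is left open for the model to complete).
def build_chatglm3_prompt(system_message, messages):
    """Render an optional system message and (role, content) pairs
    into a single prompt string."""
    ret = ""
    if system_message:
        ret += f"<|system|>\n{system_message}"
    for role, content in messages:
        if content:
            ret += f"<|{role}|>\n{content}"
        else:
            # Open tag with no content: the model generates from here.
            ret += f"<|{role}|>"
    return ret


print(build_chatglm3_prompt("You are helpful.",
                            [("user", "Hi"), ("assistant", None)]))
```

This mirrors the shape of `get_prompt` branching on `self.name == "chatglm3"`: the separator logic stays the same, only the role-tag template changes.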
Could you put a link to the source here?
Hi there,
since chatglm3 uses a different chat format, I don't think we need to keep using the 'CHATGLM' separator style with an if condition to choose the format. That said, putting the chatglm3 chat format into the 'CHATGLM' separator style also makes sense. I suggest a main contributor decide whether we should create a new separator style for it @merrymercy
@silk55 could you add a source?
Hi @lucasjinreal @yanyang1024 @silk55 @ZeyuTeng96. You all added ChatGLM-3 support (#2618, #2620, #2622).
Closed by #2622.
https://huggingface.co/THUDM/chatglm3-6b/blob/fc3235f807ef5527af598c05f04f2ffd17f48bab/tokenization_chatglm.py#L184
added adaptation to chatglm3 new prompt