Feature request
Apparently LLMs might have a general understanding of roughly how long 50 characters is, but since there's already a max_tokens defined, I don't think it helps to also set a character limit in the prompt; it's not going to produce more precise results than just relying on max_tokens.
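To illustrate what I mean, here's a rough sketch of relying on max_tokens alone, assuming the OpenAI Python client; the model name, prompt wording, and diff_text placeholder are just for illustration, not how the tool actually builds its request:

```python
from openai import OpenAI

client = OpenAI()

# Illustrative diff; in practice this would be the staged changes.
diff_text = "diff --git a/app.py b/app.py\n..."

response = client.chat.completions.create(
    model="gpt-4",
    messages=[
        {"role": "system", "content": "Write a one-line git commit message for this diff."},
        {"role": "user", "content": diff_text},
    ],
    # English text averages ~4 characters per token, so ~13 output tokens
    # keeps the subject line near 50 characters without spending prompt
    # tokens on a character-limit instruction.
    max_tokens=13,
)

print(response.choices[0].message.content.strip())
```

Why?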
It's not that big of a deal; I just think it's an (admittedly small) waste of prompt tokens on every commit.
Alternatives
No response
Additional context
https://help.openai.com/en/articles/4936856-what-are-tokens-and-how-to-count-them
Using ShellGPT to test the character count accuracy of GPT 4
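For reference, a minimal sketch (assuming the tiktoken library and an illustrative commit subject) of checking how many tokens a ~50-character subject line actually uses:

```python
import tiktoken

# Encode a typical ~50-character commit subject with GPT-4's tokenizer
# to see how many tokens it takes.
enc = tiktoken.encoding_for_model("gpt-4")

subject = "Fix race condition in background job scheduler"  # 46 characters
tokens = enc.encode(subject)

print(len(subject), "characters ->", len(tokens), "tokens")
# Per the OpenAI article above, English text averages ~4 characters per
# token, so a 50-character subject usually lands around 10-15 tokens.
```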