
[ENHANCEMENT] OpenAI auto-instrumentation: track cached tokens count #4903

Open
arizedatngo opened this issue Oct 7, 2024 · 1 comment
Labels
c/traces enhancement New feature or request triage issues that need triage

Comments

@arizedatngo
Contributor

arizedatngo commented Oct 7, 2024

Details: with OpenAI's prompt caching feature, we should likely track the `cached_tokens` field:

https://platform.openai.com/docs/guides/prompt-caching

```json
"usage": {
  "prompt_tokens": 2006,
  "completion_tokens": 300,
  "total_tokens": 2306,
  "prompt_tokens_details": { "cached_tokens": 1920 },
  "completion_tokens_details": { "reasoning_tokens": 0 }
}
```
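A minimal sketch of what the instrumentor might do with that payload: flatten the `usage` dict into span-attribute pairs, including the nested `cached_tokens`. The attribute names (`llm.token_count.*`) and the helper itself are assumptions for illustration, not the actual OpenInference convention.

```python
def extract_token_counts(usage: dict) -> dict:
    """Flatten an OpenAI `usage` payload into flat attribute pairs.

    Hypothetical helper; attribute keys are placeholders, not the
    real OpenInference semantic conventions.
    """
    counts = {
        "llm.token_count.prompt": usage.get("prompt_tokens", 0),
        "llm.token_count.completion": usage.get("completion_tokens", 0),
        "llm.token_count.total": usage.get("total_tokens", 0),
    }
    # `prompt_tokens_details` may be absent on older responses, so
    # only record cached_tokens when the API actually returned it.
    details = usage.get("prompt_tokens_details") or {}
    if "cached_tokens" in details:
        counts["llm.token_count.prompt_details.cached"] = details["cached_tokens"]
    return counts


usage = {
    "prompt_tokens": 2006,
    "completion_tokens": 300,
    "total_tokens": 2306,
    "prompt_tokens_details": {"cached_tokens": 1920},
    "completion_tokens_details": {"reasoning_tokens": 0},
}
print(extract_token_counts(usage)["llm.token_count.prompt_details.cached"])  # 1920
```

Guarding on the presence of `prompt_tokens_details` keeps the instrumentor backward-compatible with models and API versions that do not report caching.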

@arizedatngo arizedatngo added enhancement New feature or request triage issues that need triage labels Oct 7, 2024
@dosubot dosubot bot added the c/traces label Oct 7, 2024
@arizedatngo
Contributor Author

arizedatngo commented Oct 8, 2024

Additional thought: it might be nice to have the change reflected in the UI.

> it will be more consistent if we show the cached_tokens with the other existing token counts, right?
> btw, we have not logged cached_tokens yet, will let you know when we start to log it

