Additional thought: it might be nice to have the change reflected in the UI.
" it will be more consistent if we show the cached_tokens with other existing token count right?
btw, we have not logged cached_tokens yet, will let you know when we start to log it"
Details: With the prompt caching feature from OpenAI, we should likely track `cached_tokens`:
https://platform.openai.com/docs/guides/prompt-caching
```json
"usage": {
  "prompt_tokens": 2006,
  "completion_tokens": 300,
  "total_tokens": 2306,
  "prompt_tokens_details": { "cached_tokens": 1920 },
  "completion_tokens_details": { "reasoning_tokens": 0 }
}
```
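A minimal sketch of pulling `cached_tokens` out of a usage payload like the one above. The field names come from the example response; the helper name `extract_cached_tokens` and the fallback-to-zero behavior (for older responses that lack `prompt_tokens_details`) are our assumptions, not part of any SDK.

```python
def extract_cached_tokens(usage: dict) -> int:
    """Return cached_tokens from a usage dict, defaulting to 0 when
    prompt_tokens_details is absent (e.g. older API responses).
    Note: helper name and fallback behavior are assumptions, not SDK API."""
    details = usage.get("prompt_tokens_details") or {}
    return details.get("cached_tokens", 0)


# Example usage, mirroring the payload above:
usage = {
    "prompt_tokens": 2006,
    "completion_tokens": 300,
    "total_tokens": 2306,
    "prompt_tokens_details": {"cached_tokens": 1920},
    "completion_tokens_details": {"reasoning_tokens": 0},
}

print(extract_cached_tokens(usage))            # 1920
print(extract_cached_tokens({"prompt_tokens": 10}))  # 0
```

Defaulting to 0 keeps logging consistent whether or not the caching details are present in a given response.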