How does the OpenAI callback handler work? #7432
-
I'm specifically looking at this snippet:

```python
from langchain.callbacks import get_openai_callback
from langchain.llms import OpenAI

llm = OpenAI(temperature=0)
with get_openai_callback() as cb:
    llm("What is the square root of 4?")
print(cb)
```

My understanding is that …

EDIT: …
Replies: 2 comments
-
Answer generated by a 🤖

Answer

You're correct in your understanding that … In the LangChain framework, the …

This is one potential explanation based on the code snippets and links you've provided. If you find this explanation accurate and feel it's not covered in the existing documentation, I encourage you to make a pull request to update the documentation. This would greatly benefit other users who might encounter the same confusion.

Sources

This response is meant to be useful, save you time, and share context. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.
-
The correct answer can be found in the "EDIT" part of my question.
More in detail:

- `get_openai_callback` sets the value of a context var called `openai_callback_var` to an object of type `OpenAICallbackHandler`, which tracks the token usage and the cost.
- `BaseOpenAI` creates its list of callbacks via the `configure` method of the `CallbackManager` object.
- The `on_llm_end` method of the `OpenAICallbackHandler` is applied to the generated tokens once the generation is finished, updating the handler's attributes.
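For anyone who finds that flow hard to picture, here is a minimal, self-contained sketch of the same pattern. The names (`TokenCostHandler`, `get_token_cost_callback`, `SimpleLLM`, `callback_var`) are hypothetical stand-ins, not LangChain's actual classes; only the overall mechanism (context var → configured callback list → `on_llm_end` updating the handler's attributes) mirrors what the framework does:

```python
# Simplified sketch of the get_openai_callback mechanism. All names here
# are illustrative stand-ins, not LangChain's real API.
from contextlib import contextmanager
from contextvars import ContextVar
from typing import Optional


class TokenCostHandler:
    """Stand-in for OpenAICallbackHandler: accumulates token usage and cost."""

    def __init__(self) -> None:
        self.total_tokens = 0
        self.total_cost = 0.0

    def on_llm_end(self, token_usage: dict) -> None:
        # Called once generation is finished; updates the handler's attributes.
        tokens = token_usage.get("total_tokens", 0)
        self.total_tokens += tokens
        self.total_cost += tokens * 0.000002  # illustrative per-token rate

    def __str__(self) -> str:
        return f"Tokens Used: {self.total_tokens}\nTotal Cost (USD): ${self.total_cost}"


# The context var that the context manager sets and resets.
callback_var: ContextVar[Optional[TokenCostHandler]] = ContextVar("callback", default=None)


@contextmanager
def get_token_cost_callback():
    """Mimics get_openai_callback: install a handler for the duration of the with-block."""
    handler = TokenCostHandler()
    token = callback_var.set(handler)
    try:
        yield handler
    finally:
        callback_var.reset(token)


class SimpleLLM:
    """Stand-in for BaseOpenAI: its configure step collects the active handlers."""

    def _configure_callbacks(self) -> list:
        handler = callback_var.get()
        return [handler] if handler is not None else []

    def __call__(self, prompt: str) -> str:
        callbacks = self._configure_callbacks()
        result = "2"                        # pretend generation
        usage = {"total_tokens": 12}        # pretend usage reported by the API
        for cb in callbacks:
            cb.on_llm_end(usage)
        return result


with get_token_cost_callback() as cb:
    SimpleLLM()("What is the square root of 4?")
print(cb)  # prints the accumulated token count and cost
```

With the real library, the handler yielded by `get_openai_callback` exposes attributes such as `cb.total_tokens` and `cb.total_cost`, which is what `print(cb)` summarizes in the snippet from the question.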