Replies: 1 comment
I found a similar discussion that might help you resolve the issue. Here's how you can modify your code so that the streaming callbacks are invoked token by token.
Here's the updated code:

```python
from typing import Any, Dict, List

from langchain.callbacks.streaming_aiter import AsyncIteratorCallbackHandler
from langchain_aws.chat_models.bedrock import ChatBedrock
from langchain_core.chat_history import InMemoryChatMessageHistory
from langchain_core.messages import BaseMessage, HumanMessage
from langchain_core.runnables import RunnableConfig
from langchain_core.runnables.history import RunnableWithMessageHistory


class MyCustomHandler(AsyncIteratorCallbackHandler):
    # AsyncIteratorCallbackHandler is an async handler, so the
    # overridden hooks must be coroutines (async def).
    async def on_llm_new_token(self, token: str, **kwargs: Any) -> None:
        print(f"New token: {token}", end="", flush=True)

    async def on_llm_end(self, response: Any, **kwargs: Any) -> None:
        print("\nLLM generation completed.")

    async def on_chat_model_start(
        self,
        serialized: Dict[str, Any],
        messages: List[List[BaseMessage]],
        **kwargs: Any,
    ) -> None:
        """Run when the chat model starts running."""
        print("Streaming chat model started.")


# Keep one history object per session so the conversation persists
# across calls (returning a fresh InMemoryChatMessageHistory on every
# lookup would silently discard the history).
store: Dict[str, InMemoryChatMessageHistory] = {}

def get_session_history(session_id: str) -> InMemoryChatMessageHistory:
    if session_id not in store:
        store[session_id] = InMemoryChatMessageHistory()
    return store[session_id]


chat_bedrock = ChatBedrock(
    model_id="anthropic.claude-3-sonnet-20240229-v1:0",
    streaming=True,
    verbose=True,
    callbacks=[MyCustomHandler()],  # Attach the custom handler here
)

chain = chat_bedrock
chain_with_history = RunnableWithMessageHistory(
    chain,
    get_session_history,
    input_messages_key="messages",  # matches the dict key passed to invoke()
).with_config(RunnableConfig(callbacks=[MyCustomHandler()]))

response = chain_with_history.invoke(
    {"messages": [HumanMessage(content="Tell me a joke about parrots.")]},
    config={"configurable": {"session_id": "my_session"}},
)
print(response.content)
```

Also, if you are using a custom LLM class, ensure it implements the streaming methods (`_stream`/`_astream`); otherwise LangChain can only deliver the response as a single chunk.
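For intuition, the token-by-token pattern that `AsyncIteratorCallbackHandler` implements can be sketched without any LangChain or AWS dependency: the producer pushes tokens into a queue as they arrive, and a consumer iterates them. The names below (`TokenStreamHandler`, `fake_llm`) are illustrative stand-ins, not LangChain APIs.

```python
import asyncio

class TokenStreamHandler:
    """Illustrative stand-in for AsyncIteratorCallbackHandler:
    callbacks push tokens into a queue, a consumer iterates them."""

    def __init__(self) -> None:
        self.queue: asyncio.Queue = asyncio.Queue()
        self.done = asyncio.Event()

    async def on_llm_new_token(self, token: str) -> None:
        # Called once per generated token.
        await self.queue.put(token)

    async def on_llm_end(self) -> None:
        # Signal that no more tokens will arrive.
        self.done.set()

    async def aiter(self):
        # Yield tokens until generation has ended and the queue is drained.
        while not (self.done.is_set() and self.queue.empty()):
            try:
                token = await asyncio.wait_for(self.queue.get(), timeout=0.1)
            except asyncio.TimeoutError:
                continue
            yield token

async def fake_llm(handler: TokenStreamHandler, text: str) -> None:
    # Simulate a model emitting tokens one by one.
    for tok in text.split():
        await handler.on_llm_new_token(tok + " ")
    await handler.on_llm_end()

async def main() -> str:
    handler = TokenStreamHandler()
    producer = asyncio.create_task(fake_llm(handler, "Polly wants a cracker"))
    chunks = [t async for t in handler.aiter()]  # consume tokens as they stream
    await producer
    return "".join(chunks)

result = asyncio.run(main())
print(result)
```

The key point is that the handler only works if its coroutine hooks actually get scheduled, which is why a sync `def on_llm_new_token` override on an async handler never fires.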
Description
I have this sample code and my `on_llm_new_token` method is never invoked. The response is printed as one complete string rather than streaming token by token. How can I correct this behaviour? I also tried `chain_with_history.with_config(RunnableConfig(callbacks=[MyCustomHandler()]))`, but that didn't help.

System Info
langchain==0.3.0
langchain-core==0.3.2
langchain-aws==0.2.1