Adding memory to the prompt node solely #5706
Hi @HGamalElDin For use cases like this we have the `ConversationalAgent`. Or just try out the following code:

```python
from haystack.nodes import PromptNode
from haystack.agents.conversational import ConversationalAgent

prompt_node = PromptNode("gpt-3.5-turbo", api_key=MY_API_KEY, max_length=256)
conversational_agent = ConversationalAgent(prompt_node=prompt_node)
conversational_agent.run("your query here")
```
What if we need a pipeline that uses memory as a variable to fill in the prompt template? I.e., I want to do some things to get documents first, then pass memory as well. Also, how can we create arbitrary variables that can be filled in as part of a prompt node, not just the defaults like documents, memory, and query?
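In Haystack 1.x, a `PromptTemplate` fills its `{placeholders}` from the node's invocation context, so extra keys passed through the pipeline can become template variables. As a rough illustration of that mechanism only (plain Python, not the Haystack API; the `render` helper and the `tone` variable are hypothetical):

```python
import re


def render(template: str, invocation_context: dict) -> str:
    """Fill each {placeholder} in the template from the invocation context."""

    def substitute(match: re.Match) -> str:
        key = match.group(1)
        if key not in invocation_context:
            raise ValueError(f"Prompt variable '{key}' missing from invocation context")
        return str(invocation_context[key])

    return re.sub(r"\{(\w+)\}", substitute, template)


template = (
    "Conversation so far:\n{memory}\n\n"
    "Documents:\n{documents}\n\n"
    "Tone: {tone}\n\n"
    "Question: {query}"
)
context = {
    "memory": "Human: Hi\nAI: Hello!",
    "documents": "Paris is the capital of France.",
    "tone": "formal",  # an arbitrary, non-default variable
    "query": "What is the capital of France?",
}
print(render(template, context))
```

The point of the sketch: any key present in the context dict can back a placeholder, which is how non-default variables beyond `documents`, `memory`, and `query` would be resolved.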
Thank you for getting back to me so quickly. However, I already created a pipeline that has a custom file-converter node, a Retriever, and a PromptNode that takes query, memory, and documents as inputs, and I'm encountering unexpected errors from the agent around the required variables in the prompt (documents, memory). Would you please showcase how to implement this case using conversational agents, @julian-risch? The documentation lacks a little clarity here.
Please confirm whether it is possible to implement a RAG pipeline with chat memory without using agents.
So here I have two variables. I tried the following snippet; however, the default memory injected into the agent doesn't store the human input at all.

When I call it, I get:

```
Human: input
AI: Paris is the capital of France.
Human: input
AI: During Ramadan, working hours are 10:30 AM to 4:00 PM, Sunday to Thursday.
Human: input
AI: France's capital is Paris.
```

Also, the prompt template I provided should constrain the responses: no random answers should be generated, only answers from the context, or else the specific fallback message I provided. This works well if I use the prompt node alone, but with the agent it's as if the prompt template doesn't apply. What could be the possible cause of this? I need help please! @ZanSara @julian-risch
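Note that the transcript above records the literal string `input` where the user's actual question should appear, which suggests the memory is being fed a placeholder key rather than the real utterance. A minimal sketch of what a conversation memory that stores the real user turn looks like (plain Python, not Haystack's memory classes; the class and method names are illustrative):

```python
class SimpleConversationMemory:
    """Keeps an alternating Human/AI transcript for injection into a prompt."""

    def __init__(self) -> None:
        self.turns: list[tuple[str, str]] = []

    def add(self, human: str, ai: str) -> None:
        # Store the actual user utterance, not a placeholder key like "input".
        self.turns.append((human, ai))

    def load(self) -> str:
        return "\n".join(f"Human: {h}\nAI: {a}" for h, a in self.turns)


memory = SimpleConversationMemory()
memory.add("What is the capital of France?", "Paris is the capital of France.")
memory.add(
    "What are the working hours during Ramadan?",
    "During Ramadan, working hours are 10:30 AM to 4:00 PM, Sunday to Thursday.",
)
print(memory.load())
```

With the real question stored, the rendered transcript would read `Human: What is the capital of France?` instead of `Human: input`.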
I can see the agent makes it much easier to create an extensive application with many tools and integrations. However, I'm working on a simple application that needs exactly one prompt node, but with memory. Is there any workaround to do so in the meantime?
Actually, adding memory to the prompt node alone already exists in many other tools; here, we are required to use agents for it.