
feature/introducting-conversational-retrieval-tool-agent #2430


@niztal (Contributor) commented May 17, 2024

The Conversational Retrieval QA Chain is a very helpful chain for building a RAG flow on top of a Vector Store Retriever.

One of its major capabilities is querying the vector store with the user's input and calling the LLM with the query results filling the {context} placeholder, which is then combined into the LLM's prompt.

But since it is just a chain, it has no way to combine Tools.

The only options are (as described in the linked Bug & Question):

  1. Combine a flow based on some Agent (e.g. ToolAgent) with a Chain Tool that connects the Conversational Retrieval QA Chain as its base chain - this is a bug and currently does not work. I tried to fix it, but it was too cumbersome.
  2. Use an LLM Chain with a VectorStoreRetrieverMemory (based on a solution provided by LangChain) - this did not provide the needed solution; it just uses the Vector Store (e.g. Pinecone) as memory for the LLM Chain. In general I think that's a good feature to have in Flowise, and I can open another dedicated PR for it (FYI @HenryHengZJ).
  3. Build on an existing Agent (e.g. ToolAgent) and enhance it with a Vector Store Retriever and a {context} placeholder in the system message.

The 3rd option works best 🚀
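To make option 3 concrete, here is a minimal sketch (not the exact prompt shipped in this PR) of what the agent's system message could look like, with {context} left as a placeholder that is filled from the Vector Store Retriever at query time:

```
You are a helpful assistant. Answer the user's questions using the
information in the context below. If the answer is not in the context,
say that you don't know. You may also call the tools at your disposal
when they are needed.

Context:
{context}
```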

[screenshots]

@HenryHengZJ please review; I would love to get any feedback from you or anyone else 🙏

niztal changed the title from feature/introducting-openai-conversational-retriever-agent to feature/introducting-conversational-retrieval-tool-agent May 18, 2024
niztal closed this May 18, 2024
niztal deleted the feature/openai-conversational-retriever-agent branch May 18, 2024 18:38
niztal restored the feature/openai-conversational-retriever-agent branch May 18, 2024 18:40
niztal deleted the feature/openai-conversational-retriever-agent branch May 18, 2024 18:40
niztal restored the feature/openai-conversational-retriever-agent branch May 18, 2024 18:42
niztal reopened this May 18, 2024
@niztal (Contributor, Author) commented May 20, 2024

@HenryHengZJ can you please review it? I'd really love to get your feedback about it.

Thanks

@HenryHengZJ (Contributor) commented May 21, 2024

@niztal thanks for the suggestion!

Question: why do we need the context in the system message when the user can use the Retriever Tool? From my testing, including the context in the system message doesn't work well. For example, I was asking how to install Flowise using the following:
[screenshot]

You can see that the correct installation context was placed in the system message, but it still replies that it doesn't know.
[screenshot]
Trace link here - https://smith.langchain.com/public/6e42424b-29b9-4df2-b14c-073bf2329ccd/r

@HenryHengZJ (Contributor):

As a comparison, using Retriever Tool + Agent gives a much better response:
[screenshot]
https://smith.langchain.com/public/c9dec40b-f36b-4cfb-898a-8141c09714d4/r

@niztal (Contributor, Author) commented May 22, 2024

Hi @HenryHengZJ,

Thanks for your feedback, I really appreciate it and was anticipating it 👍

I looked into your analysis. Before answering it, I think there was some misunderstanding (please let me know if I missed anything), but based on the LangSmith traces you provided, it seems you manually set the context inside the system message.
[screenshot]

If that's correct, that is not what I meant when introducing this new agent. You need to leave {context} as a placeholder (same as in the Conversational Retrieval QA Chain). The agent then knows to fill it in when needed by querying the Vector Store and prompting the LLM.

In general, I would say this agent's main purpose is to have a bot (RAG) based on a specific knowledge base that the model (in your case GPT-3.5) was not trained on. In your case GPT-3.5 is already familiar with Flowise, since I guess it was trained on web data.
You can see that by using ChatGPT without any additional knowledge base:
[screenshot]

If you would like a bot with specific instructions + a specific knowledge base (a.k.a. "context") + the ability to execute tools, then at least to my knowledge it is not doable without my agent.

Thanks,

@HenryHengZJ (Contributor):

The {context} is used as a placeholder in the system message. When a question is asked, it queries the vector store, retrieves documents, and fills them in; that's what you saw in the trace.

To create a bot that has access to specific knowledge, you would typically use the Retriever Tool for that - https://docs.flowiseai.com/use-cases/multiple-documents-qna#agent - and specify instructions in the System Message, like:

You are an expert financial analyst that always answers questions with the most relevant information using the tools at your disposal.
These tools have information regarding companies that the user has expressed interest in.
Here are some guidelines that you must follow:
* For financial questions, you must use the tools to find the answer and then write a response.
* Even if it seems like your tools won't be able to answer the question, you must still use them to find the most relevant information and insights. Not using them will appear as if you are not doing your job.
* You may assume that the user's financial questions are related to the documents they've selected.
* For any user message that isn't related to financial analysis, respectfully decline to respond and suggest that the user ask a relevant question.
* If your tools are unable to find an answer, you should say that you haven't found an answer but still relay any useful information the tools found.
* Don't ask clarifying questions, just return an answer.

The tools at your disposal have access to the following SEC documents that the user has selected to discuss with you:
- Apple Inc (APPL) FORM 10K 2022
- Tesla Inc (TSLA) FORM 10K 2022

The current date is: 2024-01-28

Including the context in the system message usually doesn't work well because the LLM tends to lose focus as the token count increases.

@niztal (Contributor, Author) commented May 22, 2024

> The {context} is used as a placeholder in the system message. When a question is asked, it queries the vector store, retrieves documents, and fills them in; that's what you saw in the trace. [...] Including the context in the system message usually doesn't work well because the LLM tends to lose focus as the token count increases.

OK, I got your point. One small question:

Is the system message part of the LLM's prompt, and if so, does Flowise/LangChain send it over and over again on each iteration with the model? If so, that seems pretty expensive from a token perspective, doesn't it?

Thank you.

@HenryHengZJ (Contributor):

The whole conversation, including the system message, is always sent to the OpenAI API as the messages array - https://platform.openai.com/docs/api-reference/chat/create#chat-create-messages

It's the same as if you were to use the OpenAI API directly.
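For illustration, a minimal sketch (using the official openai Node SDK directly, outside Flowise; the model name and messages are just placeholders) of how every call carries the full messages array, system message included:

```typescript
import OpenAI from 'openai'

const client = new OpenAI() // reads OPENAI_API_KEY from the environment

async function ask() {
    // Every request re-sends the system message plus the whole chat history,
    // so the system message tokens are billed on every call.
    const completion = await client.chat.completions.create({
        model: 'gpt-3.5-turbo',
        messages: [
            { role: 'system', content: 'You are a helpful assistant. Context: ...' },
            { role: 'user', content: 'How can I install Flowise?' },
            { role: 'assistant', content: 'You can install Flowise with npm...' },
            { role: 'user', content: 'And how do I start it?' }
        ]
    })
    console.log(completion.choices[0].message.content)
}

ask()
```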

@toi500 (Contributor) commented May 23, 2024

@HenryHengZJ just a quick clarification. The main reason for this PR is that, for some reason, the Chain Tool is not working properly when it is attached to a Conversational Retrieval QA Chain, as I reported here: #2400

So, is the Chain Tool bugged, or is the fact that it does not work with the Conversational Retrieval QA Chain a technical limitation?

I have some people asking me about this on Discord too.

[screenshot]

@HenryHengZJ (Contributor):

@toi500 we introduced a new tool - Chatflow Tool:
[screenshot]

Can you try creating a separate chatflow that has a Conversational Retrieval QA Chain, then link it via this new tool?

@toi500 (Contributor) commented May 23, 2024

@HenryHengZJ that is an amazing tool, I did not know you had added it.

Unfortunately, it doesn't work properly according to my testing. The Tool Agent is able to ask the correct question to the Chatflow Tool, but it receives no response.

[screenshots]

Of course, I tested that the other flow works on its own, and there was no problem.

[screenshot]

@niztal (Contributor, Author) commented May 23, 2024

> Unfortunately, it doesn't work properly according to my testing. The Tool Agent is able to ask the correct question to the Chatflow Tool, but it receives no response.

@toi500 please take my commit and test it as well using my agent; I wonder if it will work for your scenario. I admit I'm not an expert in LangChain, so I can't explain precisely why, but all the tests I've done, including the ones suggested by @HenryHengZJ, were unfortunately not accurate enough compared to the flow I built with my agent.

@soumyabhoi:

It works for me, but it is slow to get a response from the other chain when using the Chatflow Tool. An agent with both QA and tool-calling capability would be really beneficial.

@toi500 (Contributor) commented May 23, 2024

> It works for me, but it is slow to get a response from the other chain when using the Chatflow Tool. An agent with both QA and tool-calling capability would be really beneficial.

Can you share a capture of the toolOutput?

[screenshot]

I am asking because OpenAI will make something up to fulfill a response, even if it gets nothing from the Chatflow Tool -> Conversational Retrieval QA Chain.

[screenshot]

@HenryHengZJ (Contributor):

It definitely works for me though:

1.) [screenshot]

2.) [screenshot]

3.) the chatflow being called:
[screenshot]

4.) [screenshot]

Check whether your chatflow is API key protected; if yes, you need to create a credential.
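For anyone hitting the same issue, here is a minimal sketch (hypothetical host, chatflow ID and API key) of how a protected chatflow's prediction endpoint is typically called with a credential; this is presumably the same call the Chatflow Tool makes under the hood:

```typescript
// Hypothetical values - replace with your own host, chatflow ID and API key.
const FLOWISE_HOST = 'http://localhost:3000'
const CHATFLOW_ID = '00000000-0000-0000-0000-000000000000'
const FLOWISE_API_KEY = '<your-chatflow-api-key>'

async function predict(question: string) {
    const response = await fetch(`${FLOWISE_HOST}/api/v1/prediction/${CHATFLOW_ID}`, {
        method: 'POST',
        headers: {
            'Content-Type': 'application/json',
            // Credential for an API-key-protected chatflow
            Authorization: `Bearer ${FLOWISE_API_KEY}`
        },
        body: JSON.stringify({ question })
    })
    return response.json()
}

predict('How can I install Flowise?').then(console.log)
```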

HenryHengZJ reopened this May 23, 2024
@niztal (Contributor, Author) commented May 24, 2024

> It works for me, but it is slow to get a response from the other chain when using the Chatflow Tool. An agent with both QA and tool-calling capability would be really beneficial.

@soumyabhoi this is exactly what my agent here does; you're welcome to try it and let us know. It works perfectly for me.

@toi500 (Contributor) commented May 24, 2024

@HenryHengZJ just to let you know that after a lot of testing, the new Chatflow Tool, which is amazing BTW, works. I tried it locally and on Render, and it worked as expected.

[screenshot]

However, it does not work in my production deployment on Railway. I will try to contact them and see what they say. I am getting the following error:

FetchError: invalid json response body at https://flowise-production-8ec4.up.railway.app/api/v1/prediction/9b7b16cf-e649-4e32-9f0d-41a157457402 reason: Unexpected token '<', "<!DOCTYPE "... is not valid JSON

[screenshot]

@HenryHengZJ (Contributor) commented May 24, 2024

Yeah, I noticed that on Railway too; debugging it. But does using the Chatflow Tool + Tool Agent fit your purpose? Does it achieve what you wanted to do in the first place?

Edit: fix for not working on Railway - #2482

@niztal (Contributor, Author) commented May 25, 2024

@HenryHengZJ @toi500

I was "playing" with this new tool for almost two days, I must say it may produce high value in some cases and sounds like this tool can be very effective for many people.
Having said that, for my specific use case, my agent still did a better job, and I'll explain:

  1. From performance/latency perspective it took ~7-8 seconds slower to answer, for me and for my users' experience it's crucial.
  2. From tokens perspective I feel like having a single system message rather than two would be more cost effective.
  3. From maintenance perspective, at least for me it was much easier to use one single flow rather than "jumping" from one chatflow to another.
  4. From quality perspective, my agent answered better answers and used my tools much more effective.
  5. In my case, I need to "hand over" one crucial parameter for one of my tools (function) from the QA chatflow to the main chatflow and unfortunately it didn't work properly so my main agent had to hallucinate this param 😢

Again, it may be individually for my own use-case but I rather leave my agent and use it, if you feel like it's redundant for most of the people we can close this PR and I'll use it only on my forked repo.

Thanks anyway, highly appreciated

@toi500 (Contributor) commented May 27, 2024

@niztal, I think your new agent is a great addition to Flowise. I haven't had a chance to use it yet, but I definitely want to.

I never quite understood why Chains can't use tools. Maybe it's a limitation of the LangChain framework, but I'm sure there's a good reason for it.

In any case, the Chatflow Tool, along with LangGraph, opens up a whole new paradigm here on Flowise. However, I still think it would be incredibly useful to have either your agent or the Chain Tool fixed (I find the latency of the Chatflow Tool a bit too much).

Personally, I'm very grateful for all your hard work here to help the community and this project.

@HenryHengZJ (Contributor):

@niztal yep, totally get it, and I really appreciate you putting in the effort to create this new agent! We're working on a new plugin system that allows community nodes, and you will also get an attribution tag on the node. I think that's a better place to put it, as we also want to encourage more community nodes from you guys! Will leave this PR open for now until we have that plugin feature in place, then we can migrate it over!

@niztal (Contributor, Author) commented May 29, 2024

> @niztal yep, totally get it, and I really appreciate you putting in the effort to create this new agent! We're working on a new plugin system that allows community nodes, and you will also get an attribution tag on the node. I think that's a better place to put it, as we also want to encourage more community nodes from you guys! Will leave this PR open for now until we have that plugin feature in place, then we can migrate it over!

Thanks @HenryHengZJ, highly appreciated; that sounds perfect. LMK when this feature is live.

@soumyabhoi:

I have been desperately waiting for this for the last month, and I am unable to launch into production or beta without it. Any alternative would be an immense help. It feels like a race against time to get an agent that can both chat and call tools.

@niztal (Contributor, Author) commented May 30, 2024

> I have been desperately waiting for this for the last month, and I am unable to launch into production or beta without it. Any alternative would be an immense help. It feels like a race against time to get an agent that can both chat and call tools.

@soumyabhoi you're welcome to try my agent; for me it works amazingly well.

I guess once we have the community nodes that @HenryHengZJ mentioned above, it will be globally available. For now I'm using it in my own version; if it fits your use case, you can do the same. Please LMK if you need any help.

Thanks

@toi500 (Contributor) commented May 30, 2024

@soumyabhoi, until the new plugin system is released so the community can add custom nodes (like this one), you can use the new Chatflow Tool, which allows the Tool Agent to call a whole chatflow (where you host your RAG).

https://docs.flowiseai.com/integrations/langchain/tools/chatflow-tool

@soumyabhoi:

Thanks for the suggestion @niztal and @toi500, I appreciate your effort. I am going with the Chatflow Tool for our beta phase and will switch to the new agent when the community node support launches.

@HenryHengZJ (Contributor):

@niztal we have added community node support #2902 !

Can you grant me access to edit your repo? If not, can you pull the latest changes and add a new property, author? Example - https://docs.flowiseai.com/contributing/building-node#create-calculator-tool
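For anyone following along, a rough sketch of where such an author property could sit on a node class, loosely modeled on the Calculator tool example in the docs linked above; the class name, icon and field values below are illustrative, not the exact code in this PR:

```typescript
// Abridged sketch - a real Flowise node implements the INode interface
// from the components package and defines more fields and logic.
class ConversationalRetrievalToolAgent_Agents {
    label = 'Conversational Retrieval Tool Agent'
    name = 'conversationalRetrievalToolAgent'
    version = 1.0
    type = 'AgentExecutor'
    category = 'Agents'
    icon = 'toolAgent.png'   // illustrative icon filename
    author = 'niztal'        // the new attribution property for community nodes
    description = 'Agent that answers from a vector store retriever and can also call tools'
    baseClasses = [this.type]
}

module.exports = { nodeClass: ConversationalRetrievalToolAgent_Agents }
```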

@niztal (Contributor, Author) commented Aug 8, 2024

> @niztal we have added community node support #2902 !
>
> Can you grant me access to edit your repo? If not, can you pull the latest changes and add a new property, author? Example - https://docs.flowiseai.com/contributing/building-node#create-calculator-tool

@HenryHengZJ done ✅ Sorry for the late reply, I was busy with other tasks.

Please review

@HenryHengZJ (Contributor):

Thanks @niztal ! Reminder: set SHOW_COMMUNITY_NODES to true in your .env file
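(For reference, that would look something like this in the server's .env, assuming the standard environment-variable setup:)

```
SHOW_COMMUNITY_NODES=true
```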

HenryHengZJ merged commit f327158 into FlowiseAI:main Aug 28, 2024
2 checks passed
patrickreinan pushed a commit to patrickreinan/Flowise that referenced this pull request Sep 3, 2024

* introducting openai-conversational-retriever-agent

* fix lint

* fix build

* rename + update description

* changing agent base from openai to tool agent

* adding author for community agent
@toi500 (Contributor) commented Sep 12, 2024

@niztal A conceptual question.

Why does this node only import the RESPONSE_TEMPLATE:

[screenshots]

and not the QA_TEMPLATE and REPHRASE_TEMPLATE, as the original chain does?

[screenshot]

niztal deleted the feature/openai-conversational-retriever-agent branch September 12, 2024 20:18
@niztal (Contributor, Author) commented Sep 12, 2024

> Why does this node only import the RESPONSE_TEMPLATE and not the QA_TEMPLATE and REPHRASE_TEMPLATE, as the original chain does?

Hi @toi500

TBH I'm not sure why; it was a long time ago, and the current state answered my needs back then. Feel free to contribute and add whatever you need.

Thanks

@toi500 (Contributor) commented Sep 12, 2024

@niztal if you don't mind, I am going to check how it performs with the original prompt structure, and I will report back.
