
# Needle RAG tools for Haystack


This package provides the `NeedleDocumentStore` and `NeedleEmbeddingRetriever` components for use in Haystack projects.

## Usage ⚡️

Get started by installing the package via pip:

```bash
pip install needle-haystack-ai
```

### API Keys

We will walk through building a common RAG pipeline using Needle tools and the OpenAI generator. To use these tools, you must set the environment variables `NEEDLE_API_KEY` and `OPENAI_API_KEY`, respectively.

You can get your Needle API key from Developer settings.
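
For a quick local experiment, you can set both keys directly in Python before building the pipeline. The snippet below is a minimal sketch with illustrative placeholder values; for real projects, prefer exporting the variables in your shell or loading them from a secrets manager:

```python
import os

# Illustrative placeholders only; replace with your own keys.
# In production, export these in your shell or use a secrets manager
# instead of hardcoding them in source.
os.environ["NEEDLE_API_KEY"] = "<your-needle-api-key>"
os.environ["OPENAI_API_KEY"] = "<your-openai-api-key>"
```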

### Example Pipeline 🧱

In Needle, document stores are called collections. For detailed information, see our docs. You can create a reference to your Needle collection using `NeedleDocumentStore` and use `NeedleEmbeddingRetriever` to retrieve documents from it.

```python
from needle_haystack import NeedleDocumentStore, NeedleEmbeddingRetriever

document_store = NeedleDocumentStore(collection_id="<your-collection-id>")
retriever = NeedleEmbeddingRetriever(document_store=document_store)
```
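
As a quick sanity check, you can run the retriever on its own before wiring it into a pipeline. This sketch assumes the component accepts the same `text` input used in the pipeline example below and returns its matches under a `documents` key, as implied by the pipeline connections:

```python
# Standalone retrieval sketch (assumes a `text` input and a `documents` output,
# matching how the retriever is used in the pipeline below).
result = retriever.run(text="What is the topic of the news?")
for doc in result["documents"]:
    print(doc.content)
```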

Use the retriever in a Haystack pipeline. Example:

```python
from haystack import Pipeline
from haystack.components.generators import OpenAIGenerator
from haystack.components.builders import PromptBuilder

prompt_template = """
Given the following retrieved documents, generate a concise and informative answer to the query:

Query: {{query}}
Documents:
{% for doc in documents %}
    {{ doc.content }}
{% endfor %}

Answer:
"""

prompt_builder = PromptBuilder(template=prompt_template)
llm = OpenAIGenerator()

# Add components to the pipeline
pipeline = Pipeline()
pipeline.add_component("retriever", retriever)
pipeline.add_component("prompt_builder", prompt_builder)
pipeline.add_component("llm", llm)

# Connect the components
pipeline.connect("retriever", "prompt_builder.documents")
pipeline.connect("prompt_builder", "llm")
```

Run your RAG pipeline:

```python
prompt = "What is the topic of the news?"

result = pipeline.run({
    "retriever": {"text": prompt},
    "prompt_builder": {"query": prompt}
})

# Print the final answer
print(result["llm"]["replies"][0])
```

## Support 📞

For detailed guides, take a look at our docs. If you have questions or requests, you can contact us in our Discord channel.

## License

needle-haystack is distributed under the terms of the MIT license.