
Langchain #698

Open
wants to merge 3 commits into main
Conversation

doomgrave

No description provided.

@slundberg
Collaborator

Hi @doomgrave, thanks for the PR. Happy to help facilitate LangChain interop... but can you give a description of which scenarios this helps support? Thanks!

@doomgrave
Author

doomgrave commented Mar 15, 2024

> Hi @doomgrave, thanks for the PR. Happy to help facilitate LangChain interop... but can you give a description of which scenarios this helps support? Thanks!

Sorry Scott, I pushed to main by mistake; I'm not really skilled with git!
Anyway, I made a simple interface that uses the LlamaCpp model loaded with Guidance while also being able to use LangChain embeddings and LangChain generations/chains.

The advantage is simply that you load the model once, without loading/unloading it from memory. I've put a usage description in the class. Feel free to inspect the idea.

    # llama.cpp embedding models using Guidance.
    # To use, you should have the llama-cpp-python and langchain libraries installed.
    # The LlamaCpp instance must have embedding=True.
    # USAGE EXAMPLE (using a Chroma database):

    llama2 = guidance.models.LlamaCpp(model=modelPath, n_gpu_layers=-1, n_ctx=4096, embedding=True)
    embeddings = GuidanceLlamaCppEmbeddings(model=llama2)
    vectordb = Chroma(persist_directory={path_to_chromadb}, embedding_function=embeddings)