Falcon 40B. Is any way to run with LocalAI? #1368

Answered by mudler
netandreus asked this question in Q&A
@netandreus did you try with falcon GGUF files? GGML files are quite outdated now. GGUF should work with the default llama-cpp backend as of now. Also, which version of LocalAI are you trying this with?
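For context, LocalAI models are typically registered with a small YAML config pointing at the GGUF file. The sketch below is a minimal, hypothetical example: the model name, filename, and context size are assumptions, not values from this thread.

```yaml
# Hypothetical LocalAI model config (e.g. models/falcon-40b.yaml).
# The GGUF file is assumed to be placed in the models directory.
name: falcon-40b
backend: llama-cpp          # default llama.cpp backend, which supports GGUF
parameters:
  model: falcon-40b.Q4_K_M.gguf   # assumed quantized GGUF filename
context_size: 2048
```

With a config like this in place, the model should be selectable by name via LocalAI's OpenAI-compatible API.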

Replies: 4 comments 1 reply
Answer selected by lunamidori5
Category
Q&A
Labels
bug Something isn't working
3 participants
Converted from issue

This discussion was converted from issue #1353 on November 30, 2023 17:36.