
llama-cli: command not found #1

Closed
tjthejuggler opened this issue Mar 21, 2023 · 5 comments

@tjthejuggler

Thanks so much for making and sharing this!

The first command works perfectly, but when I run the one that starts with llama-cli I get 'command not found':


bossbaby@Will-of-Steve:~/projects/llama-cli$ sudo docker run -ti --rm quay.io/go-skynet/llama-cli:latest --instruction "What's an alpaca?" --topk 10000

Alpacas are domesticated animals that are closely related to llamas and camels. They are native to the Andes Mountains in South America, where they were first domesticated by the Incas.

bossbaby@Will-of-Steve:~/projects/llama-cli$ llama-cli --model ~/ggml-alpaca-7b-q4.bin --instruction "What's an alpaca?"
llama-cli: command not found


Also, I saw from the issue post in the alpaca.cpp GitHub repo that with this project alpaca should be running in memory all the time, but it seems like it has to start up a new instance every time I run that first command. Also, when I do 'ps aux | grep alpaca' after that first command has completed, there seems to be no process with 'alpaca' running. Is it possible with this to get responses as fast as in the original alpaca.cpp, but with this awesome single-command API-style system?

@mudler
Owner

mudler commented Mar 21, 2023

Hi @tjthejuggler !

Indeed, the first command starts a new instance each time; it is meant for troubleshooting and/or automating things by piping commands to it. To have a long-running instance, start it in API mode:

docker run -p 8080:8080 -ti --rm quay.io/go-skynet/llama-cli:v0.1 api

And in another terminal run inferences with curl:

curl --location --request POST 'http://localhost:8080/predict' --header 'Content-Type: application/json' --data-raw '{
    "text": "What is an alpaca?",
    "topP": 0.8,
    "topK": 50,
    "temperature": 0.7,
    "tokens": 100
}'

The API will keep the model loaded into memory, and it's a long-running process.
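The curl call above can be wrapped in a small shell function for repeated use. This is just a convenience sketch: the /predict endpoint and the field names are taken from the example above, while the wrapper functions themselves are illustrative.

```shell
#!/bin/sh
# Build the JSON payload for the /predict endpoint shown above.
# Note: this naive printf-based quoting breaks if the prompt itself
# contains double quotes; it is only meant for simple prompts.
predict_payload() {
  prompt=$1
  printf '{"text": "%s", "topP": 0.8, "topK": 50, "temperature": 0.7, "tokens": 100}' "$prompt"
}

# Send the payload to the long-running API instance on localhost:8080.
predict() {
  curl --silent --location --request POST 'http://localhost:8080/predict' \
    --header 'Content-Type: application/json' \
    --data-raw "$(predict_payload "$1")"
}

# Print the payload that would be sent for a sample prompt.
predict_payload "What is an alpaca?"
```

With the API container running, `predict "What is an alpaca?"` then returns the model's answer without reloading the model each time.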

@tjthejuggler
Author

hey @mudler, thanks so much for the help! I've got another question: every time I run it, it makes me download that 3.839 GB file again. I don't know where it is downloading to; I can't seem to find any file that size on my HD. I assumed from its size that it was the 7B model, so I tried pointing it at that model, which I already have downloaded, but it still wants to download the 3.839 GB file again.

$ sudo docker run -p 8080:8080 -ti --rm quay.io/go-skynet/llama-cli:v0.1 api --model /models/ggml-alpaca-7b-q4.bin
Unable to find image 'quay.io/go-skynet/llama-cli:v0.1' locally
v0.1: Pulling from go-skynet/llama-cli
32fb02163b6b: Already exists
167c7feebee8: Already exists
d6dfff1f6f3d: Already exists
e9cdcd4942eb: Already exists
543368fb39ee: Already exists
5898d990df6b: Already exists
9602be2ba0fe: Already exists
dda7abc9e477: Pull complete
13679b03456b: Pull complete
c5704ac31306: Pull complete
8f2899c04205: Downloading 11.34MB/68.38MB
c829f586020d: Download complete
0837277f1cf1: Downloading 10.27MB/3.839GB
05ea17c3de8f: Download complete

Thanks again, I really appreciate your time and effort!
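A side note on the `--model /models/ggml-alpaca-7b-q4.bin` flag used above: for that path to resolve inside the container, the host directory holding the model file typically needs to be bind-mounted. A sketch, assuming the model lives in `~/` on the host (the host path here is illustrative):

```shell
# Bind-mount the host directory containing the model into the container
# at /models, so /models/ggml-alpaca-7b-q4.bin exists inside it.
docker run -p 8080:8080 -ti --rm \
  -v "$HOME:/models" \
  quay.io/go-skynet/llama-cli:v0.1 api \
  --model /models/ggml-alpaca-7b-q4.bin
```

Without the `-v` mount, a `/models/...` path refers to the container's own filesystem, where the already-downloaded file is not visible.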

@mudler
Owner

mudler commented Mar 23, 2023

Looks like there is something wrong with your Docker installation; images shouldn't be cleaned up between calls. How did you install Docker?

@tjthejuggler
Author

tjthejuggler commented Mar 23, 2023

@mudler

I have no experience with it; I hadn't even heard of it until setting up your project. All I did to set it up was follow these instructions exactly:

https://docs.docker.com/engine/install/ubuntu/

I will look into debugging it knowing that the issue is that the image is being cleaned up between calls. Thank you!

@tjthejuggler
Author

The issue has been solved! When I ran 'sudo docker images' I saw that the image was listed there and tagged 'latest', but the command I was running had ':v0.1' at the end. I switched it to ':latest' and it worked beautifully. Thanks so much, I really appreciate it!
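For anyone hitting the same thing: Docker caches images per tag, so an image cached under `:latest` is not reused when a command asks for `:v0.1`. You can check which tags are cached locally, and optionally alias the cached image under the other tag (a sketch; adjust the tags to whatever `docker images` actually shows):

```shell
# List locally cached tags for this repository.
sudo docker images quay.io/go-skynet/llama-cli

# Alias the cached :latest image under the :v0.1 tag so both
# invocations resolve to the same local image, avoiding a re-pull.
sudo docker tag quay.io/go-skynet/llama-cli:latest quay.io/go-skynet/llama-cli:v0.1
```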

@Noooste mentioned this issue Aug 27, 2023