Unable to load quantized base model with q5_0 on a Linux CentOS machine #2273
ashish1garg asked this question in Q&A · Unanswered · 0 replies
I quantized the base model with q5_0 and the resulting file is 55.3 MB. When I try to load the quantized model, I get this error:
"whisper_model_load: ERROR not all tensors loaded from model file - expected 245, got 3
whisper_init_no_state: failed to load model"
How can I make it work?
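
For context, a minimal load check against the quantized file through the whisper.cpp C API is sketched below. The model path `models/ggml-base-q5_0.bin` is only an assumption about where the quantize tool wrote its output, and this is a reduced sketch rather than my exact setup:

```c
#include <stdio.h>
#include "whisper.h"

int main(void) {
    // Assumed output path of the quantize tool for the q5_0 base model
    const char * model_path = "models/ggml-base-q5_0.bin";

    // Try to load the model; on failure this returns NULL and whisper.cpp
    // prints the whisper_model_load / whisper_init_no_state errors quoted above
    struct whisper_context_params cparams = whisper_context_default_params();
    struct whisper_context * ctx = whisper_init_from_file_with_params(model_path, cparams);

    if (ctx == NULL) {
        fprintf(stderr, "failed to load model from %s\n", model_path);
        return 1;
    }

    printf("model %s loaded OK\n", model_path);
    whisper_free(ctx);
    return 0;
}
```

If a minimal loader like this fails with the same "expected 245, got 3" message, that would suggest the quantized file itself is incomplete rather than anything in the calling code.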