Hi, I converted Falcon-11b using ct2-transformers-converter, but I get the following error when trying to use the model.
CTranslate2 version: 4.1.0
Conversion command:
```
ct2-transformers-converter --model tiiuae/falcon-11b --output_dir falcon-11b-base-ct2 --quantization int8 --trust_remote_code
```
Using the model:
```python
>>> import ctranslate2
>>> from transformers import AutoTokenizer
>>> model = ctranslate2.Generator("falcon-11b-base-ct2", device='cpu')
>>> tokenizer = AutoTokenizer.from_pretrained("tiiuae/falcon-11b")
>>> outputs = model.generate_batch([tokenizer.convert_ids_to_tokens(tokenizer.encode("Falcon 11b is a new LLM"))], sampling_topk=10, max_length=200, include_prompt_in_result=False)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ValueError: axis 2 has dimension 6144 but expected 4352
```
I assume this is because the new model architecture is unsupported. Will Falcon-11b be supported by CTranslate2? Thanks
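For what it's worth, the error reads like a tensor-shape check failing while the converted weights are wired into the runtime's expected layout. The sketch below is purely illustrative (not CTranslate2 internals); the dimensions 6144 and 4352 are taken from the traceback above, and the shape and helper name are made up for the example.

```python
# Hypothetical sketch of the kind of shape check behind the error
# (not CTranslate2's actual code): a loaded tensor reports 6144 on
# axis 2 where the runtime expects 4352.
expected_dim = 4352            # dimension the runtime expects on axis 2
loaded_shape = (1, 8, 6144)    # stand-in shape for the mismatched tensor

def check_axis(shape, axis, expected):
    actual = shape[axis]
    if actual != expected:
        raise ValueError(
            f"axis {axis} has dimension {actual} but expected {expected}")

try:
    check_axis(loaded_shape, 2, expected_dim)
except ValueError as err:
    print(err)  # axis 2 has dimension 6144 but expected 4352
```

A mismatch like this usually means the converter mapped the checkpoint onto a layout for a different Falcon variant, which is consistent with the architecture not yet being supported.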