Error in offline mode with trust_remote_code: SFR-Embedding-Mistral and nomic do not work without einops #185
Thanks for opening the issue. Did you actually try to get nomic running? I would not be concerned about the `NotImplementedError: The model type mistral is not yet supported` in the stack trace; it is just an informational warning saying that the optimum package has no better attention implementation for mistral, so infinity continues without one. For nomic:

python3 -m venv venv
source ./venv/bin/activate
pip install infinity_emb[all]
pip install einops  # einops is required only by nomic's custom modeling code.
infinity_emb --model-name-or-path nomic-ai/nomic-embed-text-v1.5

@prasannakrish97 Can you try running the above commands and post the result here?
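To see why the extra `pip install einops` is needed: models loaded with trust_remote_code can import packages (such as einops) that are not dependencies of infinity itself, so they fail at load time if the package is absent. A minimal sketch of such a guard (the `require_package` helper is hypothetical, not infinity's or nomic's actual code):

```python
import importlib.util


def require_package(name: str) -> None:
    """Raise a helpful error if an optional dependency of custom
    modeling code (e.g. einops for nomic) is missing."""
    if importlib.util.find_spec(name) is None:
        raise ImportError(
            f"{name} is required by the model's custom code; "
            f"install it with: pip install {name}"
        )


# A package that is installed passes silently:
require_package("json")

# A missing package raises ImportError with install instructions:
try:
    require_package("definitely_not_installed_pkg")
except ImportError as exc:
    print(exc)
```

This mirrors the failure mode in the issue title: the error only appears once the custom code actually tries to import einops.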
Hello. However, we're encountering the following problem for nomic (nota bene: the same nomic model works fine with Text Embeddings Inference locally, but not with infinity):
Okay, I have shown above that it is possible to run infinity with nomic. Therefore I'll suggest the following: try running those commands again. Also delete all of your pre-existing huggingface_hub caches and set an explicit commit. nomic runs with custom modeling code, so be aware that not pinning a specific revision means you will execute whatever code they publish in any future version.
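One way to pin the custom modeling code, as suggested above, is to download a specific commit of the model repository up front with `huggingface-cli` (the `<commit-sha>` below is a placeholder; pick a real commit from the model's history on the Hub):

```shell
# Download a pinned revision of the model into the local HF cache.
# <commit-sha> is a placeholder commit from the repository's history.
huggingface-cli download nomic-ai/nomic-embed-text-v1.5 --revision <commit-sha>

# Then start infinity against the cached model as before:
infinity_emb --model-name-or-path nomic-ai/nomic-embed-text-v1.5
```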
#195: I plan to make it easier to "bake a model into a Dockerfile"; too many people have had issues with that, and it requires too much knowledge of compatible huggingface_hub / sentence_transformers versions, cache paths, etc. Perhaps give it a try once it's merged.
Model description
You have mentioned that the SFR-Embedding model is supported along with all other Hugging Face embedding models (ref. nomic).
However, neither is working:
infinity | ERROR 2024-03-21 14:35:59,554 infinity_emb ERROR: acceleration.py:21
infinity | BetterTransformer is not available for model. The
infinity | model type mistral is not yet supported to be used
infinity | with BetterTransformer. Feel free to open an issue
infinity | at https://github.com/huggingface/optimum/issues if
infinity | you would like this model type to be supported.
infinity | Currently supported models are: dict_keys(['albert',
infinity | 'bark', 'bart', 'bert', 'bert-generation',
infinity | 'blenderbot', 'bloom', 'camembert', 'blip-2',
infinity | 'clip', 'codegen', 'data2vec-text', 'deit',
infinity | 'distilbert', 'electra', 'ernie', 'fsmt', 'gpt2',
infinity | 'gptj', 'gpt_neo', 'gpt_neox', 'hubert', 'layoutlm',
infinity | 'm2m_100', 'marian', 'markuplm', 'mbart', 'opt',
infinity | 'pegasus', 'rembert', 'prophetnet', 'roberta',
infinity | 'roc_bert', 'roformer', 'splinter', 'tapas', 't5',
infinity | 'vilt', 'vit', 'vit_mae', 'vit_msn', 'wav2vec2',
infinity | 'xlm-roberta', 'yolos']).. Continue without
infinity | bettertransformer modeling code.
infinity | Traceback (most recent call last):
infinity | File
infinity | "/app/infinity_emb/transformer/acceleration.py",
infinity | line 19, in to_bettertransformer
infinity | model = BetterTransformer.transform(model)
infinity | File "/usr/lib/python3.10/contextlib.py", line 79,
infinity | in inner
infinity | return func(*args, **kwds)
infinity | File
infinity | "/app/.venv/lib/python3.10/site-packages/optimum/bet
infinity | tertransformer/transformation.py", line 234, in
infinity | transform
infinity | raise NotImplementedError(
infinity | NotImplementedError: The model type mistral is not
infinity | yet supported to be used with BetterTransformer.
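For context, the traceback above boils down to a simple capability check: optimum keeps a registry of model types it can convert, and raises NotImplementedError for anything else, which infinity catches and logs before continuing without BetterTransformer. A rough, simplified sketch of that control flow (names and the tiny registry are illustrative, not optimum's actual internals):

```python
# Illustrative registry; the real one in optimum is far larger
# (see the dict_keys([...]) list in the log above).
SUPPORTED_MODEL_TYPES = {"bert", "roberta", "distilbert", "xlm-roberta"}


def transform(model_type: str) -> str:
    """Pretend conversion: succeeds only for supported model types."""
    if model_type not in SUPPORTED_MODEL_TYPES:
        raise NotImplementedError(
            f"The model type {model_type} is not yet supported "
            "to be used with BetterTransformer."
        )
    return f"{model_type} (bettertransformer)"


def load_model(model_type: str) -> str:
    """Mirror infinity's behaviour: try the conversion, fall back."""
    try:
        return transform(model_type)
    except NotImplementedError as exc:
        print(f"WARNING: {exc} Continue without bettertransformer.")
        return model_type  # the plain model still works


print(load_model("bert"))     # → bert (bettertransformer)
print(load_model("mistral"))  # warning, then the plain model
```

This is why the message is only a warning and not a fatal error: the model is still served, just without the optimized attention path.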
Open source status
Provide useful links for the implementation
No response