Feature request
There's already support for Marian and various other models, but sometimes users create their own generic EncoderDecoderModel, e.g.:
from transformers import EncoderDecoderModel
from optimum.onnxruntime import ORTModelForSeq2SeqLM
model = EncoderDecoderModel.from_encoder_decoder_pretrained("bert-base-multilingual-cased", "bert-base-multilingual-cased")
model.save_pretrained("model_dir")
# Should be able to load this, but isn't supported yet.
ort_model = ORTModelForSeq2SeqLM.from_pretrained("model_dir", from_transformers=True)
Motivation
The EncoderDecoderModel is generic enough to cover quite a lot of use cases. That said, supporting it might be hard, since it can most probably only cover encoder-decoder combinations of architectures that ORT already supports.
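For illustration, here is a minimal sketch of how a loader could recognize such a checkpoint before deciding how to export it. The helper name `is_generic_encoder_decoder` is invented, but the config keys checked are what transformers writes to config.json when an EncoderDecoderModel is saved: a top-level model_type of "encoder-decoder" plus nested encoder/decoder sub-configs.

```python
import json
import os
import tempfile

def is_generic_encoder_decoder(model_dir):
    """Return True if model_dir holds a generic EncoderDecoderModel checkpoint.

    EncoderDecoderModel.save_pretrained() writes a config.json with
    model_type "encoder-decoder" and nested "encoder"/"decoder" sub-configs,
    which distinguishes it from single-architecture seq2seq checkpoints
    such as Marian (model_type "marian").
    """
    with open(os.path.join(model_dir, "config.json")) as f:
        config = json.load(f)
    return (
        config.get("model_type") == "encoder-decoder"
        and "encoder" in config
        and "decoder" in config
    )

# Demo with a minimal fake checkpoint directory (only the config,
# no weights, is needed for this check).
model_dir = tempfile.mkdtemp()
with open(os.path.join(model_dir, "config.json"), "w") as f:
    json.dump(
        {
            "model_type": "encoder-decoder",
            "encoder": {"model_type": "bert"},
            "decoder": {"model_type": "bert"},
        },
        f,
    )
print(is_generic_encoder_decoder(model_dir))  # True
```

A loader along these lines could then dispatch to per-architecture export logic for the encoder and decoder halves, which is where the limitation above (both halves must be ORT-exportable) would surface.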
Your contribution
Maybe, if there's some guidance on how to do so.