Using local models #143
-
Does Transformers.jl support using locally stored (pre-trained) model weights & tokenizers? I have Llama 2 installed and would like to try using it with this package, but I can't find a load_model() that accepts a filepath or anything like that. Could I get a starting point for how to use local models with this package?
-
Unfortunately, Transformers.jl does not currently support loading local weights & tokenizers. You would need to call the functions that actually load each component from its files and then construct the object yourself.
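To make that concrete, the do-it-yourself route would look roughly like the sketch below. Treat it as a starting point rather than working code: the function names are my recollection of the HuggingFace submodule, and their exact signatures, including whether they accept a plain local directory at all, are assumptions that may not match your version.

```julia
# Hypothetical, untested sketch of the "construct it yourself" route for an
# HF-format checkpoint stored locally. The function names and keyword arguments
# below are assumptions about Transformers.HuggingFace internals, not a documented API.
using Transformers
using Transformers.HuggingFace

model_dir = "/path/to/llama-2-7b-hf"   # config.json, tokenizer files and weight shards

cfg     = HuggingFace.load_config(model_dir)                   # parse config.json
textenc = HuggingFace.load_tokenizer(model_dir; config = cfg)  # rebuild the text encoder from the tokenizer files
model   = HuggingFace.load_model(model_dir; config = cfg)      # assemble the Flux model and load the weights
```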
-
Dear All,

I guess we are after some help with constructing the object ourselves. I tried to use the repos that download the HuggingFace data and got stuck at the download step: it froze.

I would welcome complete documentation about the structure of: textenc, embed, encoder, decoder, embed_decode.

And even with a clear description of how they are organised, I would need help constructing them from scratch (with the right types and so on, so that they would be efficient).
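To show what I mean, here is my current (quite possibly wrong) mental model of how those pieces chain together, written with stand-in Flux layers rather than the real Transformers.jl types:

```julia
# Toy, self-contained sketch of the component roles only; every layer below is a
# stand-in, not an actual Transformers.jl type.
using Flux

vocab_size, d_model = 100, 16

embed        = Flux.Embedding(vocab_size => d_model)  # token ids -> dense vectors
encoder      = Dense(d_model => d_model, relu)        # stand-in for the transformer encoder stack
decoder      = Dense(d_model => d_model, relu)        # stand-in for the decoder stack (the real one also attends to enc_h)
embed_decode = Dense(d_model => vocab_size)           # hidden states -> vocabulary logits

src_ids = rand(1:vocab_size, 8)         # roughly what `encode(textenc, "some text")` would produce
enc_h   = encoder(embed(src_ids))       # contextualised source representations
dec_h   = decoder(embed(src_ids[1:4]))  # target-side hidden states
logits  = embed_decode(dec_h)           # one score per vocabulary entry per position
```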
I have some budget for this if anyone wants to attempt it?
Kind regards,
Tobe
On Fri, Aug 11, 2023 at 3:36 PM Peter wrote:

> > What would it take to be able to load Llama 2 easily for users?
>
> It depends on the type of local model you have. For Llama 2 there are two versions of the model files: the format used by the huggingface transformers library and the format used by the Meta implementation. The code in Transformers.jl only supports the huggingface transformers format. I'm not sure which one people usually get.
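For anyone else landing here: the two layouts are easy to tell apart from the files on disk. The huggingface-format checkpoint ships config.json plus tokenizer files and *.safetensors / *.bin weight shards, while Meta's original release ships params.json, consolidated.*.pth and tokenizer.model. A tiny helper along these lines (my own, not part of Transformers.jl) can check which one you have:

```julia
# Standalone helper (not part of Transformers.jl) that guesses which Llama 2
# layout a local directory contains, based on the files each release usually ships.
function llama_format(dir::AbstractString)
    files = readdir(dir)
    if "config.json" in files && any(f -> endswith(f, ".safetensors") || endswith(f, ".bin"), files)
        return :huggingface   # huggingface transformers format
    elseif "params.json" in files && any(f -> endswith(f, ".pth"), files)
        return :meta          # Meta's original release format
    else
        return :unknown
    end
end

llama_format("/path/to/llama-2-7b")   # => :huggingface, :meta or :unknown
```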