Add support for models trained using MLX #29333
Comments
As additional motivation: this would enable fine-tuning locally with MLX -> save in safetensors -> load model in transformers, which would be 🔥

fyi @pcuenca

Thanks for the ping @ArthurZucker! I'm supportive. Just opened #29511 to further examine all the implications. I forgot to add @alexweberk as a co-author, as he already identified the path to follow; I'll do it in another commit (or feel free to take over and lead the work, @alexweberk)!
Feature request
It would be great if model weights trained and saved through MLX could be easily loaded with the Transformers library. That way, MLX users and Transformers users could collaborate more freely on open-source models.
Motivation
Currently, Transformers only supports safetensors files whose metadata `format` is one of `["pt", "tf", "flax"]`. If Transformers adds `"mlx"` to that list, and if MLX adds `metadata={"format": "mlx"}` when saving safetensors, this can be achieved.

Here is a related issue I created on the `mlx_lm` side: ml-explore/mlx#743 (comment)
Here is a related PR on the `mlx_lm` repo: ml-explore/mlx-examples#496

Your contribution
I can send a PR with the exact code change shortly.
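The change itself would be small. As a hedged sketch of the kind of check involved (names are approximate, not the actual Transformers source), the loader validates the metadata format against a whitelist, and `"mlx"` is the proposed addition:

```python
# Sketch of the metadata check a safetensors loader applies
# (approximate; not the exact Transformers source).
SUPPORTED_FORMATS = ["pt", "tf", "flax", "mlx"]  # "mlx" is the proposed addition


def check_safetensors_format(metadata: dict) -> str:
    """Return the checkpoint format, or raise if it is not supported."""
    fmt = metadata.get("format")
    if fmt not in SUPPORTED_FORMATS:
        raise OSError(
            f"The safetensors archive has an unrecognized metadata format {fmt!r}; "
            f"expected one of {SUPPORTED_FORMATS}."
        )
    return fmt


print(check_safetensors_format({"format": "mlx"}))  # mlx
```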