
Add support for models trained using MLX #29333

Closed
alexweberk opened this issue Feb 28, 2024 · 3 comments · Fixed by #29335 or #29511

Comments

@alexweberk
Contributor

Feature request

It would be great if model weights trained and saved with MLX could be loaded directly with the Transformers library. That way, MLX users and Transformers users could collaborate more freely on open-source models.

Motivation

Currently, Transformers only loads safetensors files whose metadata declares a format in ["pt", "tf", "flax"]. If Transformers adds "mlx" to that list, and MLX adds metadata={"format": "mlx"} when saving safetensors, this becomes possible (see the sketch below).
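
A minimal sketch of both sides of that change, using the safetensors library directly. The filename and tensor name are illustrative, and the membership check only mirrors the kind of guard Transformers applies when loading a safetensors checkpoint; it is not the actual Transformers code:

```python
import numpy as np
from safetensors import safe_open
from safetensors.numpy import save_file

# What MLX would do: tag the checkpoint with an explicit "mlx" format.
weights = {"model.embed_tokens.weight": np.zeros((8, 4), dtype=np.float32)}
save_file(weights, "model.safetensors", metadata={"format": "mlx"})

# What Transformers would do: accept "mlx" alongside the existing formats.
with safe_open("model.safetensors", framework="numpy") as f:
    metadata = f.metadata()
if metadata.get("format") not in ("pt", "tf", "flax", "mlx"):
    raise OSError(f"Unsupported safetensors format: {metadata.get('format')}")
```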

Here is a related issue I created on the mlx_lm side:
ml-explore/mlx#743 (comment)

Here is a related PR on the mlx_lm repo: ml-explore/mlx-examples#496

Your contribution

I can send a PR with the exact code change shortly.

@awni

awni commented Feb 28, 2024

As additional motivation: this would enable fine-tuning locally with MLX -> saving in safetensors -> loading the model in Transformers, which would be 🔥
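
A hypothetical usage example of that workflow once the "mlx" format is accepted; the local directory name is made up, and it assumes the MLX-saved model.safetensors sits next to the usual config.json and tokenizer files:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Checkpoint fine-tuned and saved locally with MLX (safetensors tagged "mlx").
model = AutoModelForCausalLM.from_pretrained("./mlx-finetuned-model")
tokenizer = AutoTokenizer.from_pretrained("./mlx-finetuned-model")
```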

@ArthurZucker
Collaborator

fyi @pcuenca

@pcuenca
Member

pcuenca commented Mar 7, 2024

Thanks for the ping @ArthurZucker! I'm supportive. I just opened #29511 to examine all the implications. I forgot to add @alexweberk as a co-author even though he already identified the path to follow; I'll do that in another commit (or feel free to take over and lead the work, @alexweberk)!
