The Python wheels for macOS ARM are now built with the Ruy backend to support INT8 computation. This changes the performance and results when loading an INT8 model and/or using the `auto` compute type. To keep the previous behavior, set `compute_type="float32"`.
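For reference, a minimal sketch of pinning the compute type with the Python API; the model directory and tokens below are placeholders:

```python
import ctranslate2

# Keep the previous full-precision behavior instead of the new INT8/Ruy path.
# "model_dir" is a placeholder for a converted CTranslate2 model directory.
translator = ctranslate2.Translator("model_dir", device="cpu", compute_type="float32")

# The tokens below are purely illustrative.
results = translator.translate_batch([["▁Hello", "▁world"]])
print(results)
```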
New features
* Support conversion of the GPT-J architecture
* Support conversion of models using rotary position embeddings
* Apply the new OpenNMT-py option `decoder_start_token`
* Add option `revision` in the Transformers converter to download a specific revision of the model from the Hugging Face Hub (see the sketch after this list)
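As an illustration of the GPT-J and `revision` items above, here is a sketch of converting a GPT-J checkpoint from the Hugging Face Hub while pinning a revision. The model ID, revision string, and output directory are placeholders, and the exact converter signature may differ between CTranslate2 versions:

```python
from ctranslate2.converters import TransformersConverter

# Convert a GPT-J checkpoint downloaded from the Hugging Face Hub.
# Model ID, revision, and output directory are illustrative placeholders.
converter = TransformersConverter(
    "EleutherAI/gpt-j-6B",  # Hub model ID (GPT-J architecture)
    revision="float16",     # download this specific revision of the model
)
converter.convert("gptj_ct2", quantization="int8", force=True)
```

The `ct2-transformers-converter` command-line entry point should expose the same options (e.g. `--revision`) if the CLI is preferred.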