
Releases: ikergarcia1996/Easy-Translate

Easy-Translate 2.1 Release

30 Nov 13:37

SeamlessM4T

Added support for SeamlessM4T, a collection of models designed to provide high-quality translation, allowing people from different linguistic communities to communicate effortlessly through speech and text. It was introduced by Meta AI in the SeamlessM4T paper and first released in the seamless_communication repository.

Usage

# Medium Model
python3 translate.py \
--sentences_path sample_text/en.txt \
--output_path sample_text/en2es.translation.seamless-m4t-medium.txt \
--source_lang eng \
--target_lang spa \
--model_name facebook/hf-seamless-m4t-medium


# Large Model
python3 translate.py \
--sentences_path sample_text/en.txt \
--output_path sample_text/en2es.translation.seamless-m4t-large.txt \
--source_lang eng \
--target_lang spa \
--model_name facebook/hf-seamless-m4t-large

Translate all the files in a directory

If you want to translate all the files in a directory, use the new --sentences_dir flag instead of --sentences_path. The model is loaded only once for all files!

# We use --files_extension txt to translate only files with this extension.
# Use an empty string to translate all files in the directory.

python3 translate.py \
--sentences_dir sample_text/ \
--output_path sample_text/translations \
--files_extension txt \
--source_lang en \
--target_lang es \
--model_name facebook/m2m100_1.2B

Easy-Translate 2.0 Release

18 Jun 18:37
ad85b8c

I have implemented multiple upgrades to Easy-Translate. Don't worry: Easy-Translate 2.0 is fully backward compatible with the previous version, so you don't need to change anything. Easy-Translate 2.0 is as "Easy" as it has always been, and it is still designed for translating large text files with a single command. Easy-Translate 2.0 now supports:

  • Loading huge models on a single GPU with 8-bit / 4-bit quantization, plus support for splitting a model between GPU and CPU. See Loading Huge Models for more information.
  • Support for LoRA models.
  • Support for any Seq2SeqLM or CausalLM model from HuggingFace's Hub.
  • Prompt support! See Prompting for more information.
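As a sketch of how these options combine on the command line (the flag names --precision and --force_auto_device_map follow the repository README; verify them against your installed version before running):

```shell
# Hypothetical invocation combining the 2.0 features: load the model
# with 8-bit quantization and let accelerate split it automatically
# across the available GPUs and CPU.
python3 translate.py \
--sentences_path sample_text/en.txt \
--output_path sample_text/en2es.translation.m2m100_1.2B.txt \
--source_lang en \
--target_lang es \
--model_name facebook/m2m100_1.2B \
--precision 8 \
--force_auto_device_map
```

This is a configuration sketch rather than a guaranteed-working command; quantized loading also requires the bitsandbytes package to be installed.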

See README.md for more information.