This is a package for finetuning GPT-2 models.
It is based on work from:
- OpenAI's official GPT-2 repository
- Finetuning functionality from nshepperd's fork of the official GPT-2 repository
Install the package:

```sh
pip install gpt_2_finetuning
```
Download GPT-2 models through the terminal:

```sh
# Available model sizes: 124M, 355M, 774M
# Example command to download the 124M model
download_gpt2_model 124M
```
Example usage:
- Generating samples from a GPT-2 model (a hedged sketch follows this list)
- Finetuning a GPT-2 model on Shakespeare's works (see the second sketch below)
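
A minimal sampling sketch, assuming the package re-exports the `sample_model` function from the fork's `generate_unconditional_samples.py` under the module path shown; both the module path and the exact keyword arguments are assumptions to verify against the installed package.

```python
# ASSUMPTION: the package exposes sample_model from the fork's
# generate_unconditional_samples.py under this module path.
from gpt_2_finetuning.generate_unconditional_samples import sample_model

# Print a few unconditional samples from the downloaded 124M model.
sample_model(
    model_name='124M',   # must match a model fetched with download_gpt2_model
    nsamples=3,          # how many samples to print
    length=200,          # tokens per sample
    temperature=0.8,     # lower values give more conservative text
    top_k=40,            # sample only from the 40 most likely next tokens
)
```

For prompted (conditional) generation, nshepperd's fork also ships an interactive conditional sampling script; the same caveat about module paths applies if the package exposes it.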
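
The finetuning sketch below assumes the package wraps nshepperd's training loop in a callable whose keyword arguments mirror `train.py`'s command-line flags (`--dataset`, `--model_name`, and so on); the module path, function name, and parameter names are assumptions, not a confirmed API.

```python
# ASSUMPTION: the package exposes the training loop as an importable function;
# the module path and keyword arguments below mirror train.py's CLI flags and
# should be checked against the installed package.
from gpt_2_finetuning.train import train  # assumed entry point

# Finetune the downloaded 124M model on a plain-text corpus,
# e.g. a single UTF-8 file containing Shakespeare's works.
train(
    dataset='shakespeare.txt',  # path to the training text
    model_name='124M',          # base checkpoint to start from
    batch_size=1,
    learning_rate=1e-4,
    sample_every=100,           # print a sample every 100 steps
    save_every=1000,            # write a checkpoint every 1000 steps
)
```

In nshepperd's fork the defaults write checkpoints under a local checkpoint directory that the sampling scripts can then point at; treat the parameter names above as placeholders until confirmed.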
The model code is from the paper "Language Models are Unsupervised Multitask Learners". See more details in OpenAI's blog post.
WARNING: Samples are unfiltered and may contain offensive content.