# gpt-2-finetuning

This is a package for finetuning GPT-2 models.

It is based on the work done by:

## Usage

Install the package:

```shell
pip install gpt_2_finetuning
```

Download GPT-2 models through the terminal:

```shell
# Available model sizes: 124M, 355M, 774M
# Example command to download a model
download_gpt2_model 124M
```

Example usage:

- Generating samples from a GPT-2 model
- Finetuning a GPT-2 model on Shakespeare's works

## Further reading

Code from the paper "Language Models are Unsupervised Multitask Learners".

See more details in our blog post.

## GPT-2 samples

**WARNING:** Samples are unfiltered and may contain offensive content.

## License

MIT