Code for our CVPR'23 paper: "Polynomial Implicit Neural Representations For Large Diverse Datasets"
The libraries are borrowed from the StyleGAN-XL repository. Big thanks to the authors for their wonderful code.
- 64-bit Python 3.8 and PyTorch 1.9.0 (or later)
- CUDA toolkit 11.1 or later.
- GCC 7 or later.
- Use the following commands with Miniconda3 to create and activate your Python environment:
conda env create -f environment.yml
conda activate polyinr
python dataset_tool.py --source=./data/location --dest=./data/dataname_256.zip --resolution=256x256 --transform=center-crop
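The command above packs the images into a single zip archive (the StyleGAN-style dataset format: image files plus an optional `dataset.json` holding class labels). As a sanity check before training, a small sketch like the following can summarize such an archive; the function name and the demo zip here are illustrative, not part of the repository.

```python
import json
import os
import tempfile
import zipfile

def summarize_dataset_zip(path):
    """Count image files and labels in a StyleGAN-style dataset zip.

    Assumes the layout produced by dataset_tool.py: image files plus an
    optional dataset.json of the form {"labels": [[filename, class], ...]}.
    """
    with zipfile.ZipFile(path) as zf:
        names = zf.namelist()
        images = [n for n in names if n.lower().endswith((".png", ".jpg", ".jpeg"))]
        labels = None
        if "dataset.json" in names:
            labels = json.loads(zf.read("dataset.json"))["labels"]
    return {"num_images": len(images),
            "num_labels": 0 if labels is None else len(labels)}

if __name__ == "__main__":
    # Build a tiny stand-in zip to demonstrate the summary; in practice you
    # would point summarize_dataset_zip at ./data/dataname_256.zip.
    with tempfile.TemporaryDirectory() as tmp:
        path = os.path.join(tmp, "demo.zip")
        with zipfile.ZipFile(path, "w") as zf:
            zf.writestr("00000/img00000000.png", b"fake png bytes")
            zf.writestr("00000/img00000001.png", b"fake png bytes")
            zf.writestr("dataset.json", json.dumps(
                {"labels": [["00000/img00000000.png", 0],
                            ["00000/img00000001.png", 1]]}))
        print(summarize_dataset_zip(path))  # {'num_images': 2, 'num_labels': 2}
```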
python train.py --outdir=./training-runs/dataname --data=./data/dataname_32.zip --gpus=4 --batch=64 --mirror=1 --snap 10 --batch-gpu 8 --kimg 10000
python train.py --outdir=./training-runs/dataname --data=./data/dataname_64.zip --gpus=4 --batch=64 --mirror=1 --snap 10 --batch-gpu 8 --kimg 10000 \
  --superres --path_stem training-runs/dataname/00000-gmgan-dataname_32-gpus8-batch64/best_model.pkl
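Training is progressive: a 32x32 stem is trained first from a `dataname_32.zip` archive, and each higher resolution is then trained as a super-resolution stage with `--superres`, pointing `--path_stem` at the previous stage's `best_model.pkl` (so a dataset zip is needed at each training resolution). A hedged sketch of that schedule, building the command lines without running them; the run-directory placeholder is illustrative, not an exact output path:

```python
def stage_cmd(res, stem_pkl=None, data_dir="./data", outdir="./training-runs/dataname"):
    """Build the train.py argument list for one resolution stage.

    Mirrors the flags used in the commands above; every stage after the
    32x32 stem adds --superres and --path_stem.
    """
    cmd = ["python", "train.py",
           f"--outdir={outdir}",
           f"--data={data_dir}/dataname_{res}.zip",
           "--gpus=4", "--batch=64", "--mirror=1",
           "--snap", "10", "--batch-gpu", "8", "--kimg", "10000"]
    if stem_pkl is not None:
        # Super-resolution stage: reuse the previous stage's checkpoint.
        cmd += ["--superres", "--path_stem", stem_pkl]
    return cmd

stem = stage_cmd(32)  # 32x32 stem, trained from scratch
stage64 = stage_cmd(64, stem_pkl="training-runs/dataname/<stem-run>/best_model.pkl")
print(" ".join(stage64))
```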
python gen_images.py --outdir=out --trunc=0.6 --seeds=1-20 --batch-sz 1 --class 135 --network=path/to/best_model.pkl
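`--seeds=1-20` expands to one generated image per seed in the range. For reference, a small re-implementation of that range syntax; `parse_seed_range` is a hypothetical helper modeled on the dash/comma seed specs accepted by `gen_images.py`, not the repository's own code.

```python
import re

def parse_seed_range(spec):
    """Expand a seed spec like '1-20' or '1,3,5-7' into a list of ints.

    Hypothetical helper illustrating the --seeds syntax; not the
    repository's own parser.
    """
    seeds = []
    for part in spec.split(","):
        m = re.fullmatch(r"(\d+)-(\d+)", part)
        if m:
            seeds.extend(range(int(m.group(1)), int(m.group(2)) + 1))
        else:
            seeds.append(int(part))
    return seeds

print(parse_seed_range("1-3"))    # [1, 2, 3]
print(parse_seed_range("1,5-7"))  # [1, 5, 6, 7]
```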