Our approach uses a NeRF model pre-trained on one or more generic objects as the starting point for training on the object of interest. The pre-trained NeRF learns general features and representations for one or more object categories, which reduces the number of epochs needed compared with training the new object from scratch, yielding a significant saving in both time and computational resources.
Please refer to the Report and Project Video for more details.
Please create a virtual environment:

```
python3 -m venv two_nerf
source ./two_nerf/bin/activate
```
Install the libraries:

```
python3 -m pip install -r ./requirements.txt
```
You can find the `.npz` files in the `dataset` folder. Each `.npz` file contains 800 images of an object taken from the ShapeNet dataset. There are different types of objects, such as caps, tables, cars, etc. You can copy the required `.npz` file to the train folder using the following:

```
cp ./dataset/cars.npz ./data/train/
```

This is only an example. Feel free to use one or more `.npz` files to train the Two-Phase NeRF.
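If you want to verify what an archive holds before training, a quick inspection such as the minimal sketch below can help; the path follows the `cp` example above, and nothing is assumed about the array names inside the file.

```python
# Minimal sanity check of an .npz archive before training.
# The path matches the cp example above; adjust it to the file you copied.
import numpy as np

data = np.load("./data/train/cars.npz")
print(data.files)                          # names of the arrays stored in the archive
for name in data.files:
    print(name, data[name].shape, data[name].dtype)
```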
We can use the `conf.yaml` file to set up all required training and testing parameters. The following describes the configuration parameters (an example configuration is sketched after the list):
- `model` - Determines the type of NeRF we want to train. `tiny_nerf` indicates the original tiny NeRF module and `two_phase_nerf` indicates the NeRF module we introduced.
- `train_dir` - Path to the directory which holds the `.npz` files used for training. It can contain one or multiple files.
- `test_dir` - Path to the directory which holds the `.npz` file used for testing. It can contain only one file.
- `test_only` - When set to `True`, we load the saved weights for the pretext model and perform only the downstream task.
- `num_iters_first` - Sets the number of iterations for the pretext/first training phase in the two-phase module.
- `num_iters_second` - Sets the number of iterations for the object-specific training in the two-phase module (second phase). It also indicates the number of iterations for `tiny_nerf`.
- `img_output_every` - Indicates the frequency at which test-view validation occurs; outputs and PSNR plots are stored in the `{model}/results` folder.
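For reference, here is a minimal sketch that writes out a `conf.yaml` containing the parameters described above. The field names come from the list; the values are illustrative assumptions, not the repository's defaults.

```python
# Writes a sample conf.yaml with the parameters documented above.
# The values below are illustrative assumptions, not the repository defaults.
import yaml

config = {
    "model": "two_phase_nerf",      # or "tiny_nerf" for the baseline
    "train_dir": "./data/train/",   # directory with one or more .npz files
    "test_dir": "./data/test/",     # directory with exactly one .npz file
    "test_only": False,             # True loads saved pretext weights and runs only the downstream task
    "num_iters_first": 1000,        # pretext / first-phase iterations (assumed value)
    "num_iters_second": 1000,       # object-specific / second-phase iterations (assumed value)
    "img_output_every": 100,        # test-view validation frequency (assumed value)
}

with open("conf.yaml", "w") as f:
    yaml.safe_dump(config, f, sort_keys=False)
```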
The following configuration files have been set up for training:
- `conf_baseline.yaml` - Trains the baseline `tiny_nerf`.
- `conf.yaml` - Trains a `two_phase_nerf` with a single-category pretext and downstream training.
- `conf_test.yaml` - Performs downstream training on a `two_phase_nerf` with a single-category pretext.
- `conf_multi_cat.yaml` - Trains a `two_phase_nerf` with a multi-category pretext and downstream training.
Please create the `data/train` and `data/test` folders and choose the `conf.yaml` file with the desired parameters.

To run the code:

```
python3 ./main.py conf.yaml
```
Results can be viewed in `{model}/results`. The model stores image outputs and PSNR plots at `img_output_every` intervals.
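To browse the stored outputs quickly, something like the sketch below can be used; it assumes the outputs are PNG images inside a `results` folder, which may not match the exact file names or formats the code writes.

```python
# Browse the most recent result images; assumes PNG files under <model>/results.
# Adjust the glob pattern if the code writes different names or formats.
import glob
import matplotlib.image as mpimg
import matplotlib.pyplot as plt

for path in sorted(glob.glob("./*/results/*.png"))[-3:]:   # last few outputs
    plt.figure()
    plt.title(path)
    plt.imshow(mpimg.imread(path))
plt.show()
```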
We have added a few utility functions in `utils/`. The files are as follows:

- `generate_npz_from_shapenet_data` - Generates `.npz` files from the ShapeNet dataset.
- `visualize_shapenet_images` - Generates images from `.npz` files for visualization.
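The `visualize_shapenet_images` utility is the intended tool for this, but a rough standalone preview of an `.npz` archive can also be done directly; the sketch below makes no assumption about key names and simply treats any 4-D array as an image stack.

```python
# Rough standalone preview of images stored in an .npz archive.
# No specific key names are assumed; any 4-D (N, H, W, C) array is treated
# as a stack of images and a handful of its views are displayed.
import numpy as np
import matplotlib.pyplot as plt

data = np.load("./dataset/cars.npz")
for name in data.files:
    arr = data[name]
    if arr.ndim == 4:
        step = max(1, len(arr) // 4)           # show roughly four views per stack
        for i in range(0, len(arr), step):
            plt.figure()
            plt.title(f"{name}[{i}]")
            plt.imshow(arr[i])
plt.show()
```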
Team members:

- Krish Rewanth Sevuga Perumal
- Manas Sharma
- Ritika Kishore Kumar
- Sanidhya Singal