This is a fork of the Nerfstudio repository. It is used in the construction of Neural City Maps and Neural Elevation Models, projects by the Stanford NAV Lab.
Follow the Nerfstudio installation instructions here up to "Dependencies" (create a conda environment and install PyTorch and its dependencies). Afterwards, clone the repo, switch to this branch, and install the nerfstudio package from source:
```
git clone https://github.com/Stanford-NavLab/nerfstudio.git
cd nerfstudio
git checkout adam/terrain
pip install --upgrade pip setuptools
pip install -e .
```
Then, install the `terrain_nerf` package:

```
cd terrain_nerf
pip install -e .
ns-install-cli
```
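Optionally, you can confirm that both packages are importable from the active environment. This is a quick sanity check, not part of the original instructions; it assumes each editable install exposes a top-level `nerfstudio` / `terrain_nerf` module:

```python
# Sanity check: import both editable-installed packages.
# Assumes the installs above succeeded and each exposes a top-level module.
import importlib

for mod in ("nerfstudio", "terrain_nerf"):
    importlib.import_module(mod)
    print(f"{mod} imported OK")
```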
We use simulated imagery from Google Earth Studio (GES) as well as real-world aerial drone imagery. Drone imagery is available at https://dronemapper.com/sample_data/ (we use the Red Rocks, Oblique dataset).

GES can render imagery of any location on Earth (as well as the Moon and Mars) using Google Earth. An account is required.
- Generate a GES `.esp` project file from lat/lon/alt using `scripts/generate_ges_traj.py`
  - e.g., `python scripts/generate_ges_traj.py ges_traj.esp 37.333976 -121.8875317 200 --template Templates/Transamerica.json`
- Load the `.esp` file into GES (create a blank project, then import).
- Set the total time to 10 seconds (at 30 FPS, for a total of 300 frames) and check "Scale existing keyframes".
- After configuring these settings, render.
  - Drag the Google Earth logo to the bottom right corner.
- Use `scripts/ges2transforms.py` to generate `transforms.json` (a quick sanity check of the output is sketched after this list).
  - e.g., `python scripts/ges2transforms.py ../nerfstudio_ws/GESSanJose/ san_jose.json 37.333976 -121.8875317`
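The generated `transforms.json` can be sanity-checked before training. A minimal sketch, assuming the Nerfstudio-style layout (a top-level `frames` list with per-frame `file_path` and 4x4 `transform_matrix` entries); the output path below is hypothetical, so point it at wherever `ges2transforms.py` wrote its result:

```python
# Minimal sanity check of a generated transforms.json (sketch).
# "data/Scene01/transforms.json" is a hypothetical path; adjust as needed.
import json

with open("data/Scene01/transforms.json") as f:
    transforms = json.load(f)

frames = transforms["frames"]
print(f"{len(frames)} frames")
print("first image:", frames[0]["file_path"])
print("first pose:", frames[0]["transform_matrix"])
```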
Data preparation is identical to that of Nerfstudio.
- Create a `/data` folder within the repo.
- For each scene, create a folder within `/data` (e.g., `/Scene01`).
- Inside the scene folder, place the imagery and a `transforms.json` file containing camera poses and parameters. If needed, use COLMAP or Nerfstudio's `ns-process-data` to estimate camera poses. (A sketch for verifying the scene folder follows this list.)
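Before training, it can help to verify that the scene folder is consistent, i.e., that every image referenced by `transforms.json` actually exists. A minimal sketch, assuming the scene lives at `data/Scene01` and that `file_path` entries are relative to the folder containing `transforms.json`:

```python
# Check that every image referenced by transforms.json resolves on disk.
# Assumes file_path entries are relative to the scene folder; adjust if your
# transforms.json stores paths differently.
import json
from pathlib import Path

scene = Path("data/Scene01")
with open(scene / "transforms.json") as f:
    frames = json.load(f)["frames"]

missing = [fr["file_path"] for fr in frames if not (scene / fr["file_path"]).exists()]
print(f"{len(frames)} frames, {len(missing)} missing images")
for path in missing:
    print("missing:", path)
```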
As per the Nerfstudio training procedure, run the following command:

```
ns-train terrain-nerfacto --data data/Scene01
```

and monitor training through Viser and/or Weights & Biases.
To save height field weights, use `scripts/save_nemo_weights.py`.
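The saved weights can then be inspected outside of Nerfstudio. A minimal sketch, assuming `save_nemo_weights.py` writes a standard PyTorch checkpoint (the file name and key layout below are hypothetical):

```python
# Inspect saved height-field weights (sketch).
# Assumes a standard PyTorch checkpoint; "nemo_weights.pt" is a hypothetical name.
import torch

state = torch.load("nemo_weights.pt", map_location="cpu")
if isinstance(state, dict):
    for name, value in state.items():
        shape = tuple(value.shape) if hasattr(value, "shape") else type(value).__name__
        print(name, shape)
else:
    print(type(state))
```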