
PhysAvatar: Learning the Physics of Dressed 3D Avatars from Visual Observations

[teaser figure] [Paper] [Project Page]

News

  • Released the code for mesh tracking, garment physical parameter estimation, and test-time animation.
  • Released the code for appearance fitting and rendering.

TODO

  • Release the tutorial for animating the character using Mixamo data.

Quick Start

Installation

We suggest using conda with mamba to set up the environment. The following commands create a new conda environment with the required dependencies installed.

mamba env create -f environment.yml
conda activate phys_avatar
# install gaussian rasterization
git clone https://github.com/JonathonLuiten/diff-gaussian-rasterization-w-depth.git
cd diff-gaussian-rasterization-w-depth
python setup.py install
pip install .
# install Codim-IPC (run from the repo root)
cd ../Codim-IPC
python build.py
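
To sanity-check the installation, try importing the main dependencies. The rasterizer's module name below is an assumption (it typically installs as diff_gaussian_rasterization); adjust it if your build differs.

# quick sanity check (rasterizer module name assumed)
python -c "import torch; print('CUDA available:', torch.cuda.is_available())"
python -c "import diff_gaussian_rasterization"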

Download our pre-processed data (cloth mesh, fitted SMPLX model) for Actor1 of the ActorsHQ dataset. Download the ActorsHQ dataset (Actor1, Sequence 1, 4x downsampled videos) under the data folder. Download the SMPLX npz and pkl files and the VPoser pretrained weights, and put them under the data folder, following the structure data/body_models/smplx/*.npz and data/body_models/TR00_E096.pt.
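
After these downloads, the data folder should look roughly as follows. Only the paths named in this README are confirmed; the ActorsHQ subfolder layout is an assumption.

data/
├── body_models/
│   ├── smplx/           # SMPLX *.npz and *.pkl files
│   └── TR00_E096.pt     # VPoser pretrained weights
├── a1_s1/               # pre-processed data for Actor1 (cloth mesh, fitted SMPLX)
└── ActorsHQ/            # Actor1, Sequence 1, 4x downsampled videos (layout assumed)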

Mesh Tracking

bash scripts/train_mesh_lbs.sh

We suggest using wandb to visualize the training process. Replace --wandb_entity xxxx in the bash file with your wandb entity.
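
For example, you can patch the placeholder in place (substituting your own account name):

# replace the placeholder wandb entity with your own (GNU sed syntax)
sed -i 's/--wandb_entity xxxx/--wandb_entity YOUR_ENTITY/' scripts/train_mesh_lbs.sh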

Garment Physical Parameter Estimation

We first extract the garment mesh from the mesh tracking results:

python extract_cloth.py --train_dir ./output/exp1_cloth/a1_s1_460_200 --seq a1_s1 --cloth_name cloth_sim.obj

Then we estimate the physical parameters:

bash scripts/phys_param_estimation.sh
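
For intuition, the estimation fits simulation parameters so that the simulated garment matches the tracked garment mesh. The sketch below is purely illustrative: simulate_garment is a hypothetical stand-in for a Codim-IPC rollout, and a simple grid search stands in for the actual optimizer.

# Illustrative only: fit cloth parameters by matching simulated to tracked vertices.
import numpy as np

def simulate_garment(density, stiffness, n_frames=10, n_verts=100):
    # hypothetical stand-in for a Codim-IPC rollout; returns (n_frames, n_verts, 3)
    rng = np.random.default_rng(0)
    return rng.normal(size=(n_frames, n_verts, 3))

tracked = np.random.default_rng(1).normal(size=(10, 100, 3))  # tracked garment vertices

best = None
for density in [0.1, 0.2, 0.4]:        # candidate mass densities
    for stiffness in [1e3, 1e4, 1e5]:  # candidate membrane stiffnesses
        sim = simulate_garment(density, stiffness)
        loss = np.mean(np.linalg.norm(sim - tracked, axis=-1))  # mean per-vertex error
        if best is None or loss < best[0]:
            best = (loss, density, stiffness)
print('best (loss, density, stiffness):', best)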

Note that in the simulation we manually segment the garment out of the full-body mesh (data/a1_s1/cloth_sim) and define the boundary-condition points that drive the simulation. We provide the boundary-condition points (data/a1_s1/dress_v.txt) for Actor1.

If you are working on custom data, you need to prepare these files yourself; see data preparation for more details. One way to produce the boundary-point file is sketched below.
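
For illustration, you can select the garment vertices that should follow the body (e.g., around the waistband) and save their indices. This sketch assumes dress_v.txt holds one vertex index per line and that the up axis is z; both the path and the file format are assumptions, so check them against the provided Actor1 files.

# Illustrative sketch: pick boundary-condition vertices by a height threshold.
import numpy as np
import trimesh

mesh = trimesh.load('data/a1_s1/cloth_sim/cloth_sim.obj', process=False)  # path assumed
z = mesh.vertices[:, 2]                        # up axis assumed to be z
boundary = np.where(z > z.max() - 0.05)[0]     # vertices near the top (waistband)
np.savetxt('dress_v.txt', boundary, fmt='%d')  # one index per line (format assumed)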

Animation

We first compute the LBS weights for the full-body mesh using the algorithm described in Robust Skin Weights Transfer via Weight Inpainting:

python lbs_weights_inpainting.py

The optimized weights are saved in data/a1_s1/optimized_weights.npy. We use these weights to animate the human body, while the garment dynamics are simulated by Codim-IPC. For motions from the ActorsHQ dataset, run:

python run_sim_actorhq.py

For motions from the AMASS dataset, run:

python run_sim_amass.py --motion_path ./data/AMASS/MoSh/50020/shake_hips_stageii.npz --frame_num 50
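
For reference, the body deformation applied by these scripts is standard linear blend skinning: each vertex is moved by a weight-blended combination of per-bone transforms. A minimal numpy sketch (shapes assumed; this is not the repo's implementation):

# Minimal LBS sketch: weights (V, J), bone transforms (J, 4, 4), vertices (V, 3).
import numpy as np

def lbs(vertices, weights, transforms):
    homo = np.concatenate([vertices, np.ones((len(vertices), 1))], axis=1)  # (V, 4)
    blended = np.einsum('vj,jab->vab', weights, transforms)  # per-vertex 4x4 blend
    return np.einsum('vab,vb->va', blended, homo)[:, :3]     # transformed positions

# e.g., weights = np.load('data/a1_s1/optimized_weights.npy')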

Inverse Rendering

Check pbr/README.md for more details.

Citation

If you use this code or our data for your research, please cite:

PhysAvatar: Learning the Physics of Dressed 3D Avatars from Visual Observations. Yang Zheng, Qingqing Zhao, Guandao Yang, Wang Yifan, Donglai Xiang, Florian Dubost, Dmitry Lagun, Thabo Beeler, Federico Tombari, Leonidas Guibas, Gordon Wetzstein. In ECCV 2024.

Bibtex:

@inproceedings{PhysAvatar24,
    title={PhysAvatar: Learning the Physics of Dressed 3D Avatars from Visual Observations},
    author={Yang Zheng and Qingqing Zhao and Guandao Yang and Wang Yifan and Donglai Xiang and Florian Dubost and Dmitry Lagun and Thabo Beeler and Federico Tombari and Leonidas Guibas and Gordon Wetzstein},
    booktitle={European Conference on Computer Vision (ECCV)},
    year={2024}
}
