POEM: Reconstructing Hand in a Point Embedded Multi-view Stereo

Lixin Yang · Jian Xu · Licheng Zhong · Xinyu Zhan · Zhicheng Wang · Kejian Wu · Cewu Lu

CVPR 2023


POEM is designed for reconstructing hand geometry from multi-view images. It combines the structure-aware MANO mesh with an unstructured point cloud in the intersection of the cameras' frustum spaces. To infer an accurate 3D hand mesh from multi-view images, POEM introduces a cross point-set attention mechanism. It achieves state-of-the-art performance on three multi-view hand-object datasets: HO3D, DexYCB, and OakInk.

🕹️ Instructions

 

🏃 Training and Evaluation

Available models

  • set ${MODEL} as one in [POEM, MVP, PEMeshTR, FTLMeshTR]
  • set ${DATASET} as one in [DexYCBMV, HO3Dv3MV, OakInkMV]

Download the pretrained checkpoints at 🔗 ckpt and move their contents to ./checkpoint.
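For concreteness, the two variables above combine into the config path that every command below expects (POEM and DexYCBMV here are just one example pairing; any entry from the lists works):

```shell
# Pick one model and one dataset from the lists above (example values).
MODEL=POEM
DATASET=DexYCBMV

# All training and evaluation commands below take this config file.
CFG=config/release/${MODEL}_${DATASET}.yaml
echo "${CFG}"   # config/release/POEM_DexYCBMV.yaml
```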

Command line arguments

  • -g, --gpu_id, visible GPUs for training, e.g. -g 0,1,2,3. Evaluation only supports a single GPU.
  • -w, --workers, num_workers for data loading, e.g. -w 4. On HO3Dv3MV, setting -w equal to the number of GPUs in -g is recommended.
  • -p, --dist_master_port, port for distributed training, e.g. -p 60011. Use a different -p for each concurrent training process.
  • -b, --batch_size, e.g. -b 32. The default is specified in the config file but is overridden when -b is provided.
  • --cfg, config file for this experiment, e.g. --cfg config/release/${MODEL}_${DATASET}.yaml.
  • --exp_id, the name of the experiment, e.g. --exp_id ${EXP_ID}. When --exp_id is provided, the code requires that the git repo has no uncommitted changes. Otherwise, it defaults to 'default' for training and 'eval_{cfg}' for evaluation. All results are saved in exp/${EXP_ID}_{timestamp}.
  • --reload, the path to the checkpoint (.pth.tar) to be loaded.
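Putting the flags above together, a full training invocation can be assembled like this (the GPU list, worker count, port, batch size, and experiment name are illustrative placeholders, not required values):

```shell
# Example values only; substitute your own model, dataset, and flags.
MODEL=POEM
DATASET=DexYCBMV

# Assemble the command from the flags described above.
CMD="python scripts/train_ddp.py --cfg config/release/${MODEL}_${DATASET}.yaml -g 0,1,2,3 -w 4 -p 60011 -b 32 --exp_id poem_run"
echo "${CMD}"
```

Remember that --exp_id refuses to run if the git repo has uncommitted changes.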

Evaluation

Set ${PATH_TO_CKPT} to ./checkpoint/${MODEL}_${DATASET}/checkpoint/{xxx}.pth.tar. Then, run:

# use "--eval_extra" for extra evaluation.
#   "auc"            compute AUC of the predicted mesh.
#   "draw"           draw the predicted mesh of each batch.

$ python scripts/eval.py --cfg config/release/${MODEL}_${DATASET}.yaml -g 0 -b 8 --reload ${PATH_TO_CKPT}

The evaluation results will be saved at exp/${EXP_ID}_{timestamp}/evaluations.

Training

$ python scripts/train_ddp.py --cfg config/release/${MODEL}_${DATASET}.yaml -g 0,1,2,3 -w 16

Tensorboard

$ cd exp/${EXP_ID}_{timestamp}/runs/
$ tensorboard --logdir .

Checkpoint

All training checkpoints are saved at exp/${EXP_ID}_{timestamp}/checkpoints/.

 

License

The code and model provided herein are available for use as specified in the LICENSE file. By downloading and using the code and model, you agree to the terms in the LICENSE.

Citation

@inproceedings{yang2023poem,
    author    = {Yang, Lixin and Xu, Jian and Zhong, Licheng and Zhan, Xinyu and Wang, Zhicheng and Wu, Kejian and Lu, Cewu},
    title     = {POEM: Reconstructing Hand in a Point Embedded Multi-View Stereo},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2023},
    pages     = {21108-21117}
}

For more questions, please contact Lixin Yang: [email protected]