Lixin Yang · Jian Xu · Licheng Zhong · Xinyu Zhan · Zhicheng Wang · Kejian Wu · Cewu Lu
POEM is designed for reconstructing hand geometry from multi-view images. It combines the structure-aware MANO mesh with the unstructured point cloud in the intersected cameras' frustum space. To infer an accurate 3D hand mesh from multi-view images, POEM introduces cross point-set attention.
It achieves state-of-the-art performance on three multi-view hand-object datasets: HO3D, DexYCB, and OakInk.
- See `docs/installation.md` to set up the environment and install all the required packages.
- See `docs/datasets.md` to download all the datasets and data assets.
- Set `${MODEL}` as one in `[POEM, MVP, PEMeshTR, FTLMeshTR]`.
- Set `${DATASET}` as one in `[DexYCBMV, HO3Dv3MV, OakInkMV]`.
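For example, the two variables can be set like this (POEM on DexYCB is just one valid combination):

```bash
$ export MODEL=POEM
$ export DATASET=DexYCBMV
# the commands below then use config/release/${MODEL}_${DATASET}.yaml,
# i.e. config/release/POEM_DexYCBMV.yaml
```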
Download the pretrained checkpoints at 🔗 ckpt and move the contents to `./checkpoint`.
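A minimal sketch of placing the downloaded contents (the source path is a placeholder; the expected layout follows the checkpoint path used in the evaluation section below):

```bash
$ mkdir -p ./checkpoint
# move the downloaded contents into place; the expected layout is, e.g.:
#   ./checkpoint/${MODEL}_${DATASET}/checkpoint/{xxx}.pth.tar
$ mv /path/to/downloaded/* ./checkpoint/
```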
- `-g, --gpu_id`, visible GPUs for training, e.g. `-g 0,1,2,3`. Evaluation only supports a single GPU.
- `-w, --workers`, num_workers for reading data, e.g. `-w 4`. It is recommended to set `-w` equal to `-g` on HO3Dv3MV.
- `-p, --dist_master_port`, port for distributed training, e.g. `-p 60011`. Set a different `-p` for each concurrent training process.
- `-b, --batch_size`, e.g. `-b 32`. The default is specified in the config file, but it will be overwritten if `-b` is provided.
- `--cfg`, config file for this experiment, e.g. `--cfg config/release/${MODEL}_${DATASET}.yaml`.
- `--exp_id`, name of the experiment, e.g. `--exp_id ${EXP_ID}`. When `--exp_id` is provided, the code requires that no uncommitted changes remain in the git repo. Otherwise, it defaults to 'default' for training and 'eval_{cfg}' for evaluation. All results will be saved in `exp/${EXP_ID}_{timestamp}`.
- `--reload`, specify the path to the checkpoint (`.pth.tar`) to be loaded.
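For illustration, a training launch combining these flags might look like the following (the experiment name is a placeholder; minimal evaluation and training commands are given below):

```bash
$ python scripts/train_ddp.py \
    --cfg config/release/POEM_DexYCBMV.yaml \
    -g 0,1,2,3 -w 16 -b 32 -p 60011 \
    --exp_id my_poem_dexycb   # placeholder name; requires a clean git working tree
```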
Specify `${PATH_TO_CKPT}` as `./checkpoint/${MODEL}_${DATASET}/checkpoint/{xxx}.pth.tar`. Then, run:
# use "--eval_extra" for extra evaluation.
# "auc" compute AUC of the predicted mesh.
# "draw" draw the predicted mesh of each batch.
$ python scripts/eval.py --cfg config/release/${MODEL}_${DATASET}.yaml -g 0 -b 8 --reload ${PATH_TO_CKPT}
The evaluation results will be saved at `exp/${EXP_ID}_{timestamp}/evaluations`.
```bash
$ python scripts/train_ddp.py --cfg config/release/${MODEL}_${DATASET}.yaml -g 0,1,2,3 -w 16
```
```bash
$ cd exp/${EXP_ID}_{timestamp}/runs/
$ tensorboard --logdir .
```

All the training checkpoints are saved at `exp/${EXP_ID}_{timestamp}/checkpoints/`.
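Since `--reload` is listed among the common flags above, a run can presumably be warm-started from one of these checkpoints; a sketch (the checkpoint filename is a placeholder):

```bash
$ python scripts/train_ddp.py --cfg config/release/${MODEL}_${DATASET}.yaml -g 0,1,2,3 -w 16 \
    --reload exp/${EXP_ID}_{timestamp}/checkpoints/{xxx}.pth.tar
```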
The code and model provided herein are available for usage as specified in the LICENSE file. By downloading and using the code and model, you agree to the terms in the LICENSE.
```bibtex
@inproceedings{yang2023poem,
    author    = {Yang, Lixin and Xu, Jian and Zhong, Licheng and Zhan, Xinyu and Wang, Zhicheng and Wu, Kejian and Lu, Cewu},
    title     = {POEM: Reconstructing Hand in a Point Embedded Multi-View Stereo},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2023},
    pages     = {21108-21117}
}
```
For more questions, please contact Lixin Yang: [email protected]