This is the official PyTorch implementation of the ICME 2023 paper "HandGCAT: Occlusion-Robust 3D Hand Mesh Reconstruction from Monocular Images". In this repository, we provide PyTorch code for training and testing the proposed HandGCAT on the HO3D and DexYCB datasets.
Install Python >= 3.7.4 and PyTorch, then run sh requirements.sh.
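Before running the setup script, a quick check like the following (illustrative, not part of the repository) confirms the interpreter meets the version requirement:

```shell
# Illustrative check, not part of the repository:
# verify the active Python is >= 3.7.4 before running requirements.sh.
python -c "import sys; assert sys.version_info >= (3, 7, 4), 'Python >= 3.7.4 is required'; print('Python version OK')"
```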
The ${ROOT} directory is organized as below.
${ROOT}
|-- data
|-- demo
|-- common
|-- main
|-- output
data contains data-loading code and soft links to the image and annotation directories. demo contains demo code. common contains core code for HandGCAT. main contains high-level code for training and testing the network. output contains logs, trained models, visualized outputs, and test results.
You need to follow the directory structure of the data folder as below.
${ROOT}
|-- data
| |-- HO3Dv2
| | |-- data
| | | |-- train
| | | | |-- ABF10
| | | | |-- ......
| | | |-- evaluation
| | | |-- annotations
| | | | |-- HO3D_train_data.json
| | | | |-- HO3D_evaluation_data.json
| |-- HO3Dv3
| | |-- data
| | | |-- train
| | | | |-- ABF10
| | | | |-- ......
| | | |-- evaluation
| | | |-- annotations
| | | | |-- HO3D_train_data.json
| | | | |-- HO3D_evaluation_data.json
| |-- DEX_YCB
| | |-- data
| | | |-- 20200709-subject-01
| | | |-- ......
| | | |-- annotations
| | | | |-- DEX_YCB_s0_train_data.json
| | | | |-- DEX_YCB_s0_test_data.json
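As a quick sanity check, a small script like the following (hypothetical, not part of the repository) can verify that the annotation files expected by the layout above are in place before training. Adjust the list if you only use one of the datasets:

```python
import os

# Expected annotation files relative to ${ROOT}/data, mirroring the
# directory structure above (hypothetical helper for a sanity check).
EXPECTED = [
    "HO3Dv2/data/annotations/HO3D_train_data.json",
    "HO3Dv2/data/annotations/HO3D_evaluation_data.json",
    "HO3Dv3/data/annotations/HO3D_train_data.json",
    "HO3Dv3/data/annotations/HO3D_evaluation_data.json",
    "DEX_YCB/data/annotations/DEX_YCB_s0_train_data.json",
    "DEX_YCB/data/annotations/DEX_YCB_s0_test_data.json",
]

def missing_annotations(data_root):
    """Return the expected annotation files that are missing under data_root."""
    return [p for p in EXPECTED if not os.path.isfile(os.path.join(data_root, p))]

if __name__ == "__main__":
    missing = missing_annotations("data")
    if missing:
        print("Missing annotation files:")
        for p in missing:
            print("  " + p)
    else:
        print("All annotation files found.")
```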
If this repository is helpful to you, please star it. If you find our work useful in your research, please consider citing:
@inproceedings{HandGCAT2023,
title={HandGCAT: Occlusion-Robust 3D Hand Mesh Reconstruction from Monocular Images},
author={Wang, Shuaibing and Wang, Shunli and Yang, Dingkang and Li, Mingcheng and Qian, Ziyun and Su, Liuzhen and Zhang, Lihua},
booktitle={2023 IEEE International Conference on Multimedia and Expo (ICME)},
year={2023}
}
Some of the code is borrowed from the HandOccNet project. We are very grateful for their wonderful implementation.
If you have any questions about our work, please contact [email protected].