HandGCAT: Occlusion-Robust 3D Hand Mesh Reconstruction from Monocular Images

This is the official PyTorch implementation of the ICME 2023 paper HandGCAT: Occlusion-Robust 3D Hand Mesh Reconstruction from Monocular Images. In this repository, we provide PyTorch code for training and testing the proposed HandGCAT on the HO3D and DexYCB datasets.

Install

Install Python >= 3.7.4 and PyTorch, then run sh requirements.sh to install the remaining dependencies.
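
As a quick sanity check after installation, the minimal sketch below (not part of the repository; the expected CUDA setup depends on your machine) verifies the Python and PyTorch versions and whether a GPU is visible:

# check_env.py -- minimal environment sanity check (illustrative, not shipped with the repo)
import sys
import torch

print(f"Python:  {sys.version.split()[0]}")        # expected >= 3.7.4
print(f"PyTorch: {torch.__version__}")
print(f"CUDA available: {torch.cuda.is_available()}")
if torch.cuda.is_available():
    print(f"GPU: {torch.cuda.get_device_name(0)}")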

Directory

The directory structure of ${ROOT} is described below.

${ROOT}  
|-- data  
|-- demo
|-- common  
|-- main  
|-- output  

data contains the data loading code and soft links to the image and annotation directories. demo contains the demo code. common contains the core code for HandGCAT. main contains the high-level code for training and testing the network. output contains logs, trained models, visualized outputs, and test results.

Data

You need to follow the directory structure of the data as shown below; a small layout-check sketch follows the tree.

${ROOT}  
|-- data  
|   |-- HO3Dv2
|   |   |-- data
|   |   |   |-- train
|   |   |   |   |-- ABF10
|   |   |   |   |-- ......
|   |   |   |-- evaluation
|   |   |   |-- annotations
|   |   |   |   |-- HO3D_train_data.json
|   |   |   |   |-- HO3D_evaluation_data.json
|   |-- HO3Dv3
|   |   |-- data
|   |   |   |-- train
|   |   |   |   |-- ABF10
|   |   |   |   |-- ......
|   |   |   |-- evaluation
|   |   |   |-- annotations
|   |   |   |   |-- HO3D_train_data.json
|   |   |   |   |-- HO3D_evaluation_data.json
|   |-- DEX_YCB
|   |   |-- data
|   |   |   |-- 20200709-subject-01
|   |   |   |-- ......
|   |   |   |-- annotations
|   |   |   |   |-- DEX_YCB_s0_train_data.json
|   |   |   |   |-- DEX_YCB_s0_test_data.json
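
The helper below is an illustrative sketch (not part of the repository); it only uses the dataset folders and annotation file names from the tree above to confirm the layout under ${ROOT}/data before training:

# check_data.py -- verify the expected dataset layout (illustrative helper, not in the repo)
import os

ROOT = os.path.dirname(os.path.abspath(__file__))  # adjust if run outside ${ROOT}

EXPECTED = {
    "HO3Dv2":  ["data/train", "data/evaluation",
                "data/annotations/HO3D_train_data.json",
                "data/annotations/HO3D_evaluation_data.json"],
    "HO3Dv3":  ["data/train", "data/evaluation",
                "data/annotations/HO3D_train_data.json",
                "data/annotations/HO3D_evaluation_data.json"],
    "DEX_YCB": ["data/annotations/DEX_YCB_s0_train_data.json",
                "data/annotations/DEX_YCB_s0_test_data.json"],
}

for dataset, paths in EXPECTED.items():
    for rel in paths:
        full = os.path.join(ROOT, "data", dataset, rel)
        status = "ok     " if os.path.exists(full) else "MISSING"
        print(f"[{status}] {full}")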

Reference

If this repository is helpful to you, please star it. If you find our work useful in your research, please consider citing:

@inproceedings{HandGCAT2023,
  title={HandGCAT: Occlusion-Robust 3D Hand Mesh Reconstruction from Monocular Images},
  author={Wang, Shuaibing and Wang, Shunli and Yang, Dingkang and Li, Mingcheng and Qian, Ziyun and Su, Liuzhen and Zhang, Lihua},
  booktitle={2023 IEEE International Conference on Multimedia and Expo (ICME)},
  year={2023}
}

Acknowledgements

Some of the code is borrowed from the HandOccNet project. We are very grateful for their wonderful implementation.

Contact

If you have any questions about our work, please contact [email protected].
