
keras-openpose-reproduce

This is a Keras implementation of Realtime Multi-Person Pose Estimation.

Prerequisites

  1. Keras and TensorFlow (tested on a Linux machine)
  2. Python3
  3. GPU with at least 11GB memory
  4. More than 250GB of disk space for training data

Please also install the following packages:

$ sudo apt-get install libboost-all-dev libhdf5-serial-dev libzmq3-dev libopencv-dev python-opencv python3-tk python-imaging
$ sudo pip3 install Cython scikit-image pandas zmq h5py opencv-python IPython configobj
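
If you want to sanity-check the environment first, a small script like the following (a sketch, assuming a TF 1.x-era setup similar to what this repo was developed against) verifies that the key packages import and that TensorFlow can see a GPU:

# Environment sanity check (sketch; adjust for TF 2.x if needed).
import cv2
import h5py
import tensorflow as tf

print("OpenCV:", cv2.__version__)
print("h5py:", h5py.__version__)
print("TensorFlow:", tf.__version__)

# Lists local devices; on TF 2.x, tf.config.list_physical_devices('GPU') is the
# preferred call instead.
from tensorflow.python.client import device_lib
print([d.name for d in device_lib.list_local_devices() if d.device_type == "GPU"])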

Download COCO 2014 Dataset

Please download the COCO dataset and the official COCO evaluation API. Go to the folder dataset and run the following commands:

$ cd dataset
$ ./step1_download_coco2014.sh
$ ./step2_setup_coco_api.sh
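
Once the download and API setup finish, you can optionally sanity-check the annotations with the COCO API. The snippet below is a sketch; the annotation path is a placeholder and may differ from the layout created by the scripts above:

# Sanity-check the downloaded annotations with pycocotools (placeholder path).
from pycocotools.coco import COCO

ann_file = "annotations/person_keypoints_val2014.json"  # hypothetical relative path
coco = COCO(ann_file)
person_ids = coco.getCatIds(catNms=["person"])
img_ids = coco.getImgIds(catIds=person_ids)
print("val2014 images containing people:", len(img_ids))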

Prepare Training Data

Before training, we convert the images into a data format suited for efficient training: we generate the heatmaps and part affinity maps, then pack them into HDF5 files. Go to the folder training and run the following scripts. The process takes around 2 hours.

$ cd training
$ python3 generate_masks_coco2014.py
$ python3 generate_hdf5_coco2014.py

After this step, you will have train_dataset_2014.h5 and val_dataset_2014.h5, which are about 182GB and 3.8GB, respectively.
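
For reference, the heatmaps mentioned above follow the recipe from the Part Affinity Fields paper: each joint is rendered as a 2D Gaussian peak, and overlapping peaks are merged with a pixel-wise maximum. The sketch below only illustrates that idea; the repo's actual code lives in the scripts above:

# Sketch of per-joint confidence heatmap generation (not the repo's exact code).
import numpy as np

def joint_heatmap(height, width, joints, sigma=7.0):
    """joints: list of (x, y) coordinates for one joint type in one image."""
    ys, xs = np.mgrid[0:height, 0:width]
    heatmap = np.zeros((height, width), dtype=np.float32)
    for (x, y) in joints:
        g = np.exp(-((xs - x) ** 2 + (ys - y) ** 2) / (2.0 * sigma ** 2))
        heatmap = np.maximum(heatmap, g)  # keep the strongest response per pixel
    return heatmap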

Training

Simply go to the folder training and run the training script:

$ cd training
$ python3 train_pose.py
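
Because train_dataset_2014.h5 is far too large to fit in memory, training typically streams batches directly from the HDF5 file. The generator below is a simplified sketch only; the dataset keys 'data' and 'label' are placeholders, and the real keys are defined in generate_hdf5_coco2014.py:

# Sketch of streaming mini-batches from a large HDF5 file (hypothetical keys).
import h5py
import numpy as np

def batch_generator(h5_path, batch_size=10):
    with h5py.File(h5_path, "r") as f:
        images, labels = f["data"], f["label"]  # placeholder dataset names
        n = images.shape[0]
        while True:
            # h5py requires increasing, unique indices for fancy indexing.
            idx = np.sort(np.random.choice(n, batch_size, replace=False))
            yield images[idx], labels[idx]  # reads only the selected slices from disk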

Evaluation on COCO Keypoint Datasets

Please go to the folder eval and run the evaluation script. eval_method=0: single-scale evaluation. eval_method=1: multi-scale evaluation (as described in the OpenPose paper).

$ cd eval
$ python3 eval_coco2014_multi_modes.py --eval_method 0
$ python3 eval_coco2014_multi_modes.py --eval_method 1
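
For reference, the multi-scale mode follows the idea described in the OpenPose paper: run the network on the image at several scales, resize the resulting heatmaps back to the input resolution, and average them before peak finding. The sketch below illustrates this with a hypothetical single-output Keras model and assumed preprocessing; it is not the repo's exact evaluation code:

# Sketch of multi-scale heatmap averaging (hypothetical model interface).
import cv2
import numpy as np

def multi_scale_heatmaps(model, image, scales=(0.5, 1.0, 1.5, 2.0)):
    h, w = image.shape[:2]
    acc = None
    for s in scales:
        resized = cv2.resize(image, None, fx=s, fy=s, interpolation=cv2.INTER_CUBIC)
        batch = resized[np.newaxis, ...] / 256.0 - 0.5  # assumed input preprocessing
        heatmap = model.predict(batch)[0]               # assumes a single heatmap output
        heatmap = cv2.resize(heatmap, (w, h), interpolation=cv2.INTER_CUBIC)
        acc = heatmap if acc is None else acc + heatmap
    return acc / len(scales)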

Evaluation Summary

We trained the model for 100 epochs (about 2 weeks) and achieved performance comparable to the results reported in the original paper. We also compared against the original implementation, which is available online. Note that the validation list COCO2014-Val-1k is provided by the official OpenPose repository.

| Method         | Validation      | AP   |
| -------------- | --------------- | ---- |
| OpenPose paper | COCO2014-Val-1k | 58.4 |
| OpenPose model | COCO2014-Val-1k | 56.3 |
| This repo      | COCO2014-Val-1k | 58.2 |

We also evaluated the performance on the full COCO2014 validation set.

| Method         | Validation   | AP   |
| -------------- | ------------ | ---- |
| OpenPose model | COCO2014-Val | 58.9 |
| This repo      | COCO2014-Val | 59.0 |

You may find our trained model at: Dropbox

You may also find our prediction results on COCO2014 validation (json format w/o images): Dropbox
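
If you download the prediction JSON, you can re-score it with the official COCO keypoint metric. The snippet below is a sketch with placeholder file names:

# Score a COCO-format keypoint results file (placeholder paths).
from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval

coco_gt = COCO("annotations/person_keypoints_val2014.json")   # ground truth
coco_dt = coco_gt.loadRes("predictions_val2014.json")          # hypothetical filename
coco_eval = COCOeval(coco_gt, coco_dt, iouType="keypoints")
coco_eval.evaluate()
coco_eval.accumulate()
coco_eval.summarize()  # prints keypoint AP/AR under the OKS metric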

Acknowledgment

This repo is based upon @anatolix's repo keras_Realtime_Multi-Person_Pose_Estimation and @michalfaber's repo keras_Realtime_Multi-Person_Pose_Estimation.

Citation

Please cite the papers in your publications if they help your research:

@inproceedings{cao2017realtime,
  author = {Zhe Cao and Tomas Simon and Shih-En Wei and Yaser Sheikh},
  booktitle = {CVPR},
  title = {Realtime Multi-Person 2D Pose Estimation using Part Affinity Fields},
  year = {2017}
}

@inproceedings{wei2016cpm,
  author = {Shih-En Wei and Varun Ramakrishna and Takeo Kanade and Yaser Sheikh},
  booktitle = {CVPR},
  title = {Convolutional pose machines},
  year = {2016}
}
