classify human actions using pose estimation with tflite and scikit-learn
ActionAI

ActionAI is a Python library for training machine learning models to classify human actions. It is a generalization of our yoga smart personal trainer, which is included in this repository as an example.

Dependencies

  • tensorflow 2.0
  • scikit-learn
  • opencv
  • pandas
  • pillow

Data Prep

Arrange your image data as a directory of subdirectories, with each subdirectory named for the class of the images it contains. Your directory structure should look like this:

├── images_dir
│   ├── class_1
│   │   ├── sample1.png
│   │   ├── sample2.jpg
│   │   ├── ...
│   ├── class_2
│   │   ├── sample1.png
│   │   ├── sample2.jpg
│   │   ├── ...
.   .
.   .

Samples should be standard image files recognized by the Pillow library.

To generate a dataset from your images, run the data_generator.py script:

python data_generator.py

This will stage the labeled image dataset in a CSV file written to the data/ directory.

Training

After reading the CSV file into a DataFrame, a custom scikit-learn transformer estimates body keypoints to produce a low-dimensional feature vector for each sample image. This representation is fed into the scikit-learn classifier set in the config file.

Run the train.py script to train and save a classifier:

python train.py

The pickled model will be saved in the models/ directory.
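The transformer-plus-classifier flow can be sketched as a scikit-learn Pipeline. This is an illustrative sketch, not the actual train.py: the real transformer runs pose estimation on each image, the classifier is chosen in the config file, and the model filename here is an assumption. A stand-in featurizer and toy keypoint vectors are used so the example runs on its own.

```python
import os
import pickle

import numpy as np
from sklearn.base import BaseEstimator, TransformerMixin
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

class PoseFeaturizer(BaseEstimator, TransformerMixin):
    """Stand-in for the custom transformer that estimates body keypoints.

    The real transformer produces a low-dimensional keypoint feature
    vector per sample image; here we just pass vectors through.
    """

    def fit(self, X, y=None):
        return self

    def transform(self, X):
        return np.asarray(X, dtype=float)

pipeline = Pipeline([
    ("features", PoseFeaturizer()),
    ("classifier", LogisticRegression()),  # classifier choice comes from the config file
])

# Toy data: pretend each row is already a flattened keypoint vector.
X = [[0.1, 0.2, 0.3], [0.9, 0.8, 0.7], [0.2, 0.1, 0.4], [0.8, 0.9, 0.6]]
y = ["downward_dog", "warrior", "downward_dog", "warrior"]
pipeline.fit(X, y)

# Pickle the trained pipeline (filename is an assumption).
os.makedirs("models", exist_ok=True)
with open("models/classifier.pkl", "wb") as f:
    pickle.dump(pipeline, f)
```

Bundling the featurizer and classifier in one Pipeline means the same keypoint extraction is applied identically at training and inference time.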

Run

We've provided a sample inference script, inference.py, that reads input from a webcam, an MP4 file, or an RTSP stream, runs inference on each frame, and prints the results.
