Human pose similarity cannot be described well in terms of the absolute locations of body parts. One can instead generate relational pose features, which capture the relative positions and orientations of body parts. These features have been used successfully for action recognition [1] and mocap retrieval [3].
The original code was written by Hueihan Jhuang. Here, I provide some speed-ups and an interface to generate these features from (a) FMP [2] output and (b) motion capture data, as seen from a given viewing angle. For more details, see [1].
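As a toy illustration of the idea (not the actual feature definitions from [1]), a relational feature replaces absolute joint coordinates with quantities such as the distance between a pair of joints or the angle they subtend at a third joint; the joint names and values below are made up:
% Toy example: 2D joint positions (made-up values, in image coordinates).
left_hand = [0.3; 1.1];
right_hip = [0.5; 0.6];
neck      = [0.4; 1.4];
% Relational features are invariant to where the person appears in the image.
dist_feat = norm(left_hand - right_hip);                      % pairwise joint distance
v1 = left_hand - neck; v2 = right_hip - neck;
angle_feat = atan2(v1(1)*v2(2) - v1(2)*v2(1), dot(v1, v2));   % signed angle at the neck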
% Add files to path.
init_rel_pose_feats
- From FMP output
% Load fmp output already saved in a mat file.
fmp_data = load('query001.mat');
pose2d = fmp_data.pose2d;
num_frames = numel(pose2d);
% These parameters are the same as described in the paper [1].
opt = struct('T', 5, 's', 2);
% Each output matrix contains a different type of relational feature.
[norm_pos, dist_rel, angle_rel, ort_rel, cart_traj, radial_traj, dist_rel_traj,...
angle_rel_traj, ort_rel_traj] = pose_desc_fmp_raw(pose2d, 1:num_frames, opt);
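A typical next step is to stack the returned matrices into a single per-frame descriptor. Here is a minimal sketch, assuming each output matrix has one column per frame (verify the actual dimensions returned by pose_desc_fmp_raw before relying on this):
% Sketch: concatenate all relational feature types into one descriptor.
% Assumes each matrix is num_dims x num_frames; check your outputs first.
desc_fmp = [norm_pos; dist_rel; angle_rel; ort_rel; cart_traj; ...
            radial_traj; dist_rel_traj; angle_rel_traj; ort_rel_traj];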
- From orthographic projection of motion capture data
% Read a mocap file (in BVH format). BASE_PATH and FPS must be set to your
% data directory and target frame rate (the values below are placeholders).
BASE_PATH = './data/';
FPS = 30;
[imocap, ~] = load_imocap_seq('12_02', BASE_PATH, FPS);
% Pick parameters.
frame_range = 1:100;
% We assume orthographic projection.
theta = pi/3; % Elevation angle measured from the vertical.
phi = pi/2; % Azimuthal angle.
opt = struct('T', 5, 's', 2);
[norm_pos, dist_rel, angle_rel, ort_rel, cart_traj, radial_traj, dist_rel_traj,...
angle_rel_traj, ort_rel_traj] = pose_desc_imocap_raw(imocap, frame_range, ...
theta, phi, opt);
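Since [3] uses these features for video-based retrieval of motion capture, one natural follow-up is to compare a descriptor computed from FMP output against one computed from mocap. The sketch below builds a per-frame Euclidean distance matrix; desc_fmp and desc_mocap are placeholder names for descriptors stacked as in the FMP example above, and the actual flexible alignment in [3] is more involved than a nearest-frame lookup:
% Sketch: pairwise distances between video frames and mocap frames.
% Assumes desc_fmp and desc_mocap are num_dims x num_frames with matching
% num_dims; both names are placeholders, not functions from this repository.
desc_mocap = [norm_pos; dist_rel; angle_rel; ort_rel; cart_traj; ...
              radial_traj; dist_rel_traj; angle_rel_traj; ort_rel_traj];
D = zeros(size(desc_fmp, 2), size(desc_mocap, 2));
for i = 1:size(desc_fmp, 2)
    for j = 1:size(desc_mocap, 2)
        D(i, j) = norm(desc_fmp(:, i) - desc_mocap(:, j));
    end
end
[~, nearest_mocap_frame] = min(D, [], 2);  % closest mocap frame per video frame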
- demo_compute_pose by Hueihan Jhuang, available here.
- mocap-dense-trajectories, available on GitHub.
- [1] H. Jhuang, J. Gall, S. Zuffi, C. Schmid, and M. J. Black, "Towards Understanding Action Recognition," in ICCV, 2013.
- [2] Y. Yang and D. Ramanan, "Articulated Human Detection with Flexible Mixtures of Parts," TPAMI, 2013.
- [3] A. Gupta, J. He, J. Martinez, J. J. Little, and R. J. Woodham, "Efficient video-based retrieval of human motion with flexible alignment," in WACV, 2016.