mocapr

<!-- README.md is generated from README.Rmd. Please edit that file -->
<!-- badges: start -->
<!-- badges: end -->

The goal of mocapr is to help researchers and clinicians to work with
motion capture data in R by providing functions that can import, plot,
animate, and analyse motion capture data.
mocapr supports import of:

- .csv files from motion capture systems like the Captury, FreeMoCap, and OptiTrack
- .trc files from OpenCap
- kinematic output files (.mot) from OpenCap
While all functions should run without loading other libraries, I
recommend you also load the tidyverse (`library(tidyverse)`) when
loading the mocapr library.
Installation
You can install the development version of mocapr from
GitHub with:
``` r
# install.packages("devtools")
devtools::install_github("steenharsted/mocapr")
```
You may have to manually install additional packages that are needed to
run gganimate functions.
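gganimate renders animations through separate renderer packages; gifski (for GIFs) and av (for videos) are common choices. A minimal setup sketch (which packages you actually need depends on your system and output format, and is not dictated by mocapr itself):

``` r
# Renderer packages commonly used by gganimate
# (an assumption about your setup, not a mocapr requirement)
install.packages(c("gganimate", "gifski", "av"))
```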
Functions and objects in mocapr
- Import functions:
  - `import_captury_csv()`
  - `import_freemocap_csv()`
  - `import_optitrack_csv()`
  - `import_opencap_trc()`
  - `import_opencap_mot()`
- Projection functions:
  - `project_full_body_to_AP()`
  - `project_full_body_to_MP()`
- Animation and plotting functions:
  - `animate_global()`
  - `animate_anatomical()`
  - `animate_movement()`
- Helper functions for animations:
  - `align_movements()`
- Kinematics in the anatomical frontal plane:
  - `add_frontal_plane_knee_angle()`
  - `add_frontal_plane_projection_angle()`
  - `add_frontal_plane_knee_deviation()`
  - `add_knee_ankle_hip_ratios()`
- Built-in data sets:
  - `mocapr_data`
  - `mocapr_synthetic_data`
- Movement-specific functions:
  - `add_jump_length_and_height()`
  - `add_phases_jump()`
  - `add_squat_events()`
The functions and datasets have descriptions and examples that can be
explored using `help(function_name)` or `?function_name`.
The intended workflow using the above functions and supplied datasets is
visualized below. 
The built-in data sets
mocapr contains two sample data sets: `mocapr_data` and
`mocapr_synthetic_data`.
mocapr_data
mocapr_data consists of 11 movements, each supplied with a number
(`movement_nr`) and a short description (`movement_description`). Videos
of the movements with an overlay of the track are available at this
YouTube playlist.
The videos are made using the CapturyLive software.
The data is also available as raw exports in .csv format, and can be found in the folder “data-raw”.
Let's load mocapr and inspect `mocapr_data`:
``` r
library(tidyverse)
library(mocapr)

# Data
mocapr_data %>%
  group_by(movement_nr, movement_description) %>%
  nest()
#> # A tibble: 11 × 3
#> # Groups:   movement_nr, movement_description [11]
#>    movement_nr movement_description                                     data
#>          <dbl> <chr>                                                    <list>
#>  1           1 standing long jump for maximal performance               <tibble>
#>  2           2 standing long jump for maximal performance               <tibble>
#>  3           3 standing long jump with simulated poor landing techniqu… <tibble>
#>  4           4 vertical jump for maximal performance                    <tibble>
#>  5           5 gait normal in a straight line                           <tibble>
#>  6           6 gait normal in a semi square                             <tibble>
#>  7           7 gait with simulated drop foot                            <tibble>
#>  8           8 gait with simulated internal rotation                    <tibble>
#>  9           9 capoeira dance                                           <tibble>
#> 10          10 forward lunge normal                                     <tibble>
#> 11          11 squat normal                                             <tibble>
```
The format of the data is wide and contains frame-by-frame joint angles
and global joint center positions. Each joint is therefore typically
represented by 6 columns (3 angles and 3 positions). To prevent long
repetitive column names, all joint-related variables are abbreviated
according to their side (L|R), joint (A|K|H|S|E|W), and angle|position.
| Side | Joint | Angle/Position |
|:---|:---|:---|
|  | A (Ankle) | F (Flexion) |
| L (left) | K (Knee) | Varus |
|  | H (Hip) | DF (Dorsi Flexion) |
|  | T (Toe) |  |
| R (Right) | W (Wrist) | X (joint center position on the global X axis) (floor) |
|  | E (Elbow) | Y (joint center position on the global Y axis) (up) |
|  | S (Shoulder) | Z (joint center position on the global Z axis) (floor) |
Example for left knee:
| Abbreviated Variable Name | Meaning of abbreviation |
|:--:|:--:|
| LKF | Left Knee Flexion |
| LKX | Left Knee joint center position on the X axis (floor plane) |
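Because the names are systematic, related columns can be selected programmatically. A minimal base R sketch (the column names below are an illustrative subset, not the full set in `mocapr_data`):

``` r
# Illustrative subset of wide-format column names (side + joint + angle/axis)
cols <- c("frame", "LKF", "LKX", "LKY", "LKZ", "RKF", "RKX")

# Select the left knee joint-center position columns: "LK" followed by an axis
left_knee_pos <- grep("^LK[XYZ]$", cols, value = TRUE)
left_knee_pos
#> [1] "LKX" "LKY" "LKZ"
```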
The focus of this tutorial is on plotting and animating motion capture data. For this we only need the joint center positions. I will not discuss the joint angles further, but feel free to explore them on your own.
mocapr_synthetic_data
`mocapr_synthetic_data` is artificial data generated via a script. It
contains only spatial joint-center positions in the anatomical planes.
The data set is intended to show how the frontal plane kinematics work,
and also to show situations where they are likely to fail due to planar
cross-talk.
Animating Motion Capture Data With mocapr
Let's first create some sample data:
``` r
jump_1   <- filter(mocapr::mocapr_data, movement_nr == 2)
jump_2   <- filter(mocapr::mocapr_data, movement_nr == 3)
gait     <- filter(mocapr::mocapr_data, movement_nr == 6)
capoeira <- filter(mocapr::mocapr_data, movement_nr == 9)
```
Animating with animate_global()
The global coordinate system refers to a 3D coordinate system (X, Y, and
Z axis) that is created and oriented during the setup and calibration of
many motion capture systems. Global joint center positions refer to the
position of a given joint center inside the global coordinate system.
The animate_global() function animates the subject using the global
joint center positions. It creates two animations: one in the XY plane
and one in the ZY plane. If the subject is moving along either the X or
the Z axis, the viewpoints will essentially be a side view and a
front|back view.
``` r
jump_1 %>%
  animate_global(
    # gganimate options passed via ...
    nframes = nrow(.),
    fps = 50)
```
<img src="man/figures/README-jump_1_GP-1.gif" width="100%" />
If the recorded subject moves out of the primary planes of the global
coordinate system, animations and plots using global joint center
positions will appear skewed or tilted. This is what we refer to as
‘out-of-plane movement’. For instance, jump_2 simulates a poor landing
on the right knee, with the direction of the movement occurring out of
the primary planes (X and Z axes) in the global coordinate system.
Consequently, using the animate_global() function on jump_2 produces
an animation that exhibits this out-of-plane movement, which could
potentially make the animation more challenging to interpret.
``` r
jump_2 %>%
  animate_global(
    # gganimate options passed via ...
    nframes = nrow(.),
    fps = 50)
```
<img src="man/figures/README-jump_2_GP-1.gif" width="100%" />
Dealing with out-of-plane movement
Animating with animate_movement() and animate_anatomical()
In many instances, it’s straightforward to prevent out-of-plane movement
and oblique viewpoints, making the animate_global() function
sufficient. However, there are scenarios, such as when working with
pre-school children, where preventing movement that deviates from the
primary axes is challenging without interfering with the subject’s
spontaneous movements. Oblique viewpoints can also arise from variations
in the orientation—such as rotational or translational differences—of
the global coordinate system between different motion capture systems or
setups.
For the purpose of analyzing or interpreting motions, out-of-plane movement can distort the perception of movement. This necessitates animation and plotting functions that are independent of the orientation of the global coordinate system, focusing instead on the subject itself or the direction of the movement the subject is performing.
mocapr addresses this challenge by providing two functions:
project_full_body_to_MP() and project_full_body_to_AP(). The former
projects the global joint center positions onto the planes aligned with
the direction of movement, while the latter projects them onto the
anatomical planes of the subject. Essentially, these functions create
new coordinate systems that are shifted or tilted versions of the global
coordinate system.
As a result, animations that use the joint center positions in these new
coordinate systems, such as those created by animate_movement() or
animate_anatomical(), will have viewpoints that are directly in front
of and to the side of the direction of the movement or the person. This
allows for a more accurate and intuitive interpretation of the motion.
The direction of the movement is determined by the position of the subject at the first and the last frame of the recording.
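The role of the first and last frames can be illustrated with a toy base R sketch: a 2D rotation built from the subject's overall displacement (a simplified illustration of the idea, not mocapr's actual implementation):

``` r
# Toy global XZ positions of one joint across four frames;
# the subject moves diagonally through the global coordinate system
x <- c(0, 1, 2, 3)
z <- c(0, 1, 2, 3)

# Direction of movement, from the first to the last frame
theta <- atan2(z[length(z)] - z[1], x[length(x)] - x[1])

# Rotate the global coordinates so the movement direction
# becomes the new forward axis
x_mp <-  cos(theta) * x + sin(theta) * z
z_mp <- -sin(theta) * x + cos(theta) * z

round(z_mp, 10)  # sideways deviation is ~0 for straight-line movement
#> [1] 0 0 0 0
```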
These functions are best explained by looking at animations.
Let's look again at `ju
