Pykinematics
Joint angle estimation methods for both optical motion capture and IMU/MIMU-based motion capture
pykinematics
pykinematics is an open-source Python package for estimating hip kinematics using both novel Magnetic and Inertial
Measurement Unit (MIMU) wearable sensors and existing Optical Motion Capture (OMC) algorithms. The novel MIMU algorithms
have been validated against OMC, and include novel methods for estimating sensor-to-sensor relative orientation and
sensor-to-segment alignment.
Documentation
Documentation, including the examples below and the API reference, can be found in the pykinematics documentation
Requirements
- Python >=3.6
- NumPy
- SciPy
- h5py*
pip should automatically install any missing dependencies.
* h5py is required to run the example code in /scripts/example_code.py, as the sample data
provided (see Example Usage) is stored in HDF5 format. Pip will not detect and install
h5py because it is not used by pykinematics itself, so it must be installed manually to run the example code.
pip install h5py
or if using Anaconda
conda install -c anaconda h5py
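After installing, a quick check confirms that h5py is importable before running the example script:

```python
# Confirm the optional h5py dependency is available; h5py exposes its
# version string as h5py.__version__.
import h5py

print(h5py.__version__)
```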
Installation
pykinematics can be installed using pip:
pip install pykinematics
Alternatively, you can clone this repository and install from source by running pip install . from the repository root.
pykinematics can be uninstalled by running
pip uninstall pykinematics
Running tests
Tests are implemented with pytest, and can be automatically run with:
pytest --pyargs pykinematics.tests
Optionally add -v to increase verbosity.
If you don't want to run the integration tests (methods tests), use the following:
python -m pykinematics.tests --no-integration
If you want to see coverage, the following can be run (assuming coverage is installed):
coverage run -m pytest --pyargs pykinematics.tests
# generate the report
coverage report
# generate an HTML report under ./build/index.html
coverage html
Example Usage
A full example script can be found in /scripts/example_code.py. It requires a sample
data file, which can be downloaded from Sample Data.
example_code.py contains a helper function to load the data into Python.
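The helper in example_code.py handles the sample file's actual layout. As a rough illustration of reading HDF5 data with h5py (the group name and file structure below are hypothetical, not the real layout of the sample file):

```python
import h5py


def load_trial(path, trial_name):
    # Hypothetical reader: collects every dataset under a named
    # top-level group into a dict of NumPy arrays. The sample data's
    # real layout is handled by the helper in /scripts/example_code.py.
    with h5py.File(path, "r") as f:
        group = f[trial_name]
        return {name: group[name][()] for name in group}
```

Reading each dataset with [()] pulls it fully into memory as a NumPy array, so the file handle can be closed before any processing starts.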
Once the data is imported, the bulk of the processing is simple:
import pykinematics as pk
static_calibration_data, star_calibration_data, walk_fast_data = <loaded sample data>
# define some additional keyword arguments for optimizations and orientation estimation
filt_vals = {'Angular acceleration': (2, 12)}
ka_kwargs = {'opt_kwargs': {'method': 'trf', 'loss': 'arctan'}}
jc_kwargs = dict(method='SAC', mask_input=True, min_samples=1500, opt_kwargs=dict(loss='arctan'), mask_data='gyr')
orient_kwargs = dict(error_factor=5e-8, c=0.003, N=64, sigma_g=1e-3, sigma_a=6e-3)
mimu_estimator = pk.ImuAngles(gravity_value=9.8404, filter_values=filt_vals, joint_center_kwargs=jc_kwargs,
orientation_kwargs=orient_kwargs, knee_axis_kwargs=ka_kwargs)
# calibrate the estimator based on Static and Star Calibration tasks
mimu_estimator.calibrate(static_calibration_data, star_calibration_data)
# compute the hip joint angles for the Fast Walking on a treadmill
left_hip_angles, right_hip_angles = mimu_estimator.estimate(walk_fast_data, return_orientation=False)
(Figure: right hip angles from the sample data for fast walking.)