Wearables Development Toolkit (WDK)
The Wearables Development Toolkit (WDK) is a framework and set of tools to facilitate the iterative development of activity recognition applications with wearable and IoT devices. It supports the annotation of time series data, the analysis and visualization of data to identify patterns, and the development and performance assessment of activity recognition algorithms. At the core of the WDK is a repository of high-level components that encapsulate functionality used across activity recognition applications. These components can be used within a Matlab script or within a visual flow-based programming platform (i.e. Node-RED).
<p align="center"> <img width="700" src="doc/images/ARCDevelopment.png"> </p>To get a first insight into the WDK, watch this demo video: https://www.youtube.com/embed/Ow0b0vkciDs and read my paper.
To install and use the WDK, refer to the Documentation.
1- Data Annotation
An annotated data set is needed to train a machine learning algorithm and to assess its performance. The Data Annotation App offers functionality to annotate time series data. Depending on the particular application, one might want to annotate events that occur at a specific moment in time or activities that have a duration in time, called ranges. The following image shows the squared magnitude of the accelerometer signal collected by a motion sensor attached to a hind leg of a cow. The individual strides of the cow have been annotated as event annotations (red) and the walking and running activities as ranges (black rectangles).
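The squared magnitude mentioned above combines the three accelerometer axes into a single, orientation-independent signal. A minimal sketch of the computation (illustrative Python with NumPy, not WDK code):

```python
import numpy as np

def squared_magnitude(acc):
    """Combine 3-axis accelerometer samples (N x 3 array) into one signal.

    Returns ax^2 + ay^2 + az^2 per sample; squaring skips the square root
    while preserving the peaks that event detection relies on.
    """
    acc = np.asarray(acc, dtype=float)
    return np.sum(acc ** 2, axis=1)
```

Peaks in this signal correspond to moments of high acceleration regardless of how the sensor is oriented on the leg.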

Annotating with video (optional)
The Data Annotation App can load and display videos next to the data. Video and data are synchronized by specifying at least two data samples and two video frames that correspond to the same event in time. The frames in the video file are displayed by the Movie Player at the bottom right of the window:

In this application, we asked the subject to applaud three times in front of the camera while wearing an armband with an Inertial Measurement Unit (IMU). We matched the samples at the peak of squared magnitude of acceleration to the video frames where the subject's hands make contact with each other.
Please note:
- The Data Annotation App synchronizes video and data at two points and interpolates linearly in between. I recommend placing the synchronization points near the beginning and end of a recording.
- Annotation, marker, synchronization and video files should be consistent with the data files. If a data file is named 'S1.mat', its annotation file should be named 'S1-annotations.txt', its synchronization file 'S1-synchronization.txt' and the video 'S1-video.<extension>'.
- By default, the Data Annotation App loads annotation files from the './data/annotations/' directory and video and synchronization files from the './data/videos' directory. Saved annotation files are written to the root './' directory.
- The labels to annotate should be defined in the 'labels.txt' file beforehand.
- You can use the keyboard shortcuts arrow-right, arrow-left and spacebar to iterate through data and video.
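Two-point synchronization amounts to a linear mapping from sample indices to video frames, which is why the synchronization points should sit near the ends of the recording. The idea can be sketched as follows (illustrative Python, not the WDK implementation; the numbers in the example are hypothetical):

```python
def sample_to_frame(sample, sync):
    """Map a data sample index to a video frame number.

    sync holds two (sample, frame) correspondence pairs marking the same
    physical events (e.g. the claps) in both recordings; frames in
    between are obtained by linear interpolation.
    """
    (s0, f0), (s1, f1) = sync
    slope = (f1 - f0) / (s1 - s0)  # frames per sample
    return round(f0 + slope * (sample - s0))

# Hypothetical pairing: 30 fps video against 100 Hz sensor data
sync_points = [(0, 10), (1000, 310)]
frame = sample_to_frame(500, sync_points)  # midpoint of both recordings
```

Because the mapping is a straight line through the two points, any timing error in the correspondences is amplified outside the interval they span.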
Automatic Annotation (optional)
The Data Annotation App offers two features to facilitate the annotation of sensor signals.
Unsupervised automatic annotation
The unsupervised feature analyzes the entire data set, clusters similar portions of data and suggests annotations for some of these portions. This feature requires a segmentation algorithm to be provided by application developers. A set of heuristic features is extracted for each segment. Finally, k-means is used to cluster the feature vectors, using as many clusters as there are labels defined in the current project. The N feature vectors closest in distance to each centroid are suggested to the user together with a cluster id. With a single click, the user can relabel a suggested cluster to the appropriate class. The parameter N is configurable through the user interface.
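The clustering-and-suggestion step can be sketched like this (illustrative Python with NumPy and a plain k-means loop; the WDK implementation is in Matlab and its heuristic feature set differs):

```python
import numpy as np

def suggest_annotations(features, n_clusters, n_suggestions, n_iter=50, seed=0):
    """Cluster segment feature vectors with k-means and, for each cluster,
    return the indices of the n_suggestions vectors closest to its centroid."""
    rng = np.random.default_rng(seed)
    X = np.asarray(features, dtype=float)
    # Initialize centroids from randomly chosen feature vectors
    centroids = X[rng.choice(len(X), size=n_clusters, replace=False)]
    for _ in range(n_iter):
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for k in range(n_clusters):
            members = labels == k
            if members.any():
                centroids[k] = X[members].mean(axis=0)
    # Final assignment and per-cluster nearest-to-centroid suggestions
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    suggestions = {}
    for k in range(n_clusters):
        members = np.flatnonzero(labels == k)
        order = np.argsort(dists[members, k])
        suggestions[k] = members[order[:n_suggestions]].tolist()
    return labels, suggestions
```

Suggesting only the N segments nearest each centroid keeps the user's review effort proportional to the number of clusters rather than the size of the data set.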

Supervised automatic annotation
The supervised feature searches for portions of data similar to previously added range annotations. When a range annotation is added while supervised auto-annotation is enabled, the Data Annotation App scans the current file for segments of data of the same length and compares each segment to the range annotation using Dynamic Time Warping. The resulting segments are sorted by similarity and the N most similar segments are suggested to the user. The parameter N is configurable through the user interface.
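The DTW-based search can be sketched as follows (illustrative Python with a plain O(n·m) dynamic-programming DTW; not the WDK's Matlab implementation):

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic Time Warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def most_similar_segments(signal, template, n, stride=1):
    """Scan `signal` with windows of len(template) samples and return the
    start indices of the n windows with the smallest DTW distance."""
    w = len(template)
    starts = range(0, len(signal) - w + 1, stride)
    ranked = sorted(starts, key=lambda s: dtw_distance(signal[s:s + w], template))
    return ranked[:n]
```

DTW tolerates small timing differences between the annotated range and a candidate segment, which a plain Euclidean comparison would penalize.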

2- Data Analysis
The Data Analysis App displays segments of data grouped by class. This is useful to study the differences across classes to design a recognition algorithm able to discriminate between classes. Segments can be plotted either on top of each other or sequentially (i.e. after each other).

In order to visualize data:
- Select one or more input data files.
- Select where the segments should come from. Manual annotations creates segments from the range annotations and from event annotations (using the ManualSegmentationStrategy), whereas Automatic segmentation creates segments using the preprocessing, event detection and segmentation algorithms selected through the user interface.
- (in Automatic segmentation mode) Select the signals to use, a preprocessing algorithm and (optionally) an event detection algorithm.
- Select a segmentation strategy and (optionally) a grouping strategy. A grouping strategy maps annotated labels to classes, usually by grouping different labels into a single class. Click the Execute button; at this point the segments are created.
- Select signals and classes to visualize and a plot style (i.e. overlapping or sequential).
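A grouping strategy from the steps above is essentially a map from annotated labels to classes. Conceptually (illustrative Python; the label names are hypothetical, not taken from a WDK project):

```python
def group_segments(segments, grouping):
    """Bucket segments by class according to a label-to-class map.

    segments: list of (label, data) pairs; grouping: dict label -> class.
    Segments whose label is missing from `grouping` are discarded.
    """
    grouped = {}
    for label, data in segments:
        if label in grouping:
            grouped.setdefault(grouping[label], []).append(data)
    return grouped

# Hypothetical labels: merge both gaits into a single 'moving' class
grouping = {"walking": "moving", "running": "moving", "standing": "idle"}
segments = [("walking", [1, 2]), ("running", [3, 4]), ("standing", [5])]
by_class = group_segments(segments, grouping)
```

Grouping labels this way lets the same annotated data set drive both fine-grained and coarse-grained classification experiments.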
3- Algorithm Implementation
Most wearable device applications execute a sequence of computations to recognize specific patterns based on sensor signals. This sequence of computations is called the Activity Recognition Chain and consists of the following stages:

Programming
Activity recognition applications can be developed directly in Matlab using the WDK's framework of reusable components.
The following code snippet creates a chain of computations and saves it to the goalkeeperChain.mat file. This chain of computations detects events using a peak detector on the squared magnitude of the accelerometer signal, segments the data around the detected events (200 samples to the left of the event and 30 samples to the right) and extracts the features defined in the goalkeeperFeatureChain.mat file.
%select first three axes of acceleration
axisSelector = AxisSelector(1:3);%AX AY AZ
%compute the magnitude of acceleration
magnitudeSquared = MagnitudeSquared();
%detect peaks on the magnitude of acceleration
simplePeakDetector = SimplePeakDetector();
simplePeakDetector.minPeakHeight = single(0.8);
simplePeakDetector.minPeakDistance = int32(100);
%create segments around detected peaks
eventSegmentation = EventSegmentation();
eventSegmentation.segmentSizeLeft = 200;
eventSegmentation.segmentSizeRight = 30;
%label created segments
labeler = EventSegmentsLabeler();
%load feature extraction algorithm
featureExtractor = DataLoader.LoadComputer('goalkeeperFeatureChain.mat');
%create the recognition algorithm
arcChain = Computer.ComputerWithSequence({FileLoader(),PropertyGetter('data'),...
axisSelector,magnitudeSquared,simplePeakDetector,eventSegmentation,labeler,...
featureExtractor});
%export the recognition algorithm
DataLoader.SaveComputer(arcChain,'goalkeeperChain.mat');
This chain of computations produces a feature table that can be used within the Assessment App to study the performance of different machine learning algorithms.
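For readers unfamiliar with the components above, the detection and segmentation stages behave roughly as follows. This is a conceptual Python sketch using the same parameter values, not the WDK's Matlab implementation:

```python
import numpy as np

def detect_peaks(signal, min_height, min_distance):
    """Indices of local maxima above min_height, at least min_distance apart
    (analogous to the minPeakHeight / minPeakDistance properties above)."""
    peaks = []
    for i in range(1, len(signal) - 1):
        if signal[i] >= min_height and signal[i - 1] < signal[i] >= signal[i + 1]:
            if not peaks or i - peaks[-1] >= min_distance:
                peaks.append(i)
    return peaks

def segment_around_events(signal, events, left, right):
    """Cut a window [event - left, event + right] around each detected
    event, discarding events too close to the borders of the signal."""
    return [signal[e - left:e + right + 1]
            for e in events if e - left >= 0 and e + right < len(signal)]
```

Each resulting window then flows into the feature extraction stage, one feature vector per detected event.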
Visual Programming (optional)
Activity recognition applications can also be developed visually in Node-RED using the nodes available in the WDK-RED platform. The following image shows an activity recognition chain for detecting and classifying soccer goalkeeper training exercises using a wearable motion sensor attached to a glove worn by a goalkeeper:

Activity Recognition Chains can be imported and executed in the WDK as follows:
- Create and export the Activity Recognition Chains as described in the WDK-RED repository.
- Execute the convertJSONToWDK.m script.
- Use the Execute from File button in each App of the WDK.
4- Algorithm Assessment
The development and assessment of an activity recognition algorithm usually represents a large fraction of the effort to develop the entire application. The Assessment App enables developers to design algorithms by selecting reusable components at each stage of the activity recognition chain and to assess their performance. The recognition performance metrics provided by this tool are:
- Accuracy
- Precision
- Recall
- Confusion Matrix
and the computational performance metrics are:
- Flops: number of floating point operations performed by the algorithm for the input data set
- Memory: amount of memory consumed by the algorithm (in bytes) for the input data set
- Communication: amount of bytes generated by the last component in the recognition chain
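The recognition metrics above all derive from the confusion matrix. A compact sketch of how they relate (illustrative Python with NumPy, not the Assessment App's code):

```python
import numpy as np

def recognition_metrics(y_true, y_pred, n_classes):
    """Confusion matrix plus accuracy and per-class precision/recall.

    Entry [i, j] counts samples of true class i predicted as class j, so
    the diagonal holds the true positives of each class.
    """
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    tp = np.diag(cm).astype(float)
    col = cm.sum(axis=0)  # predictions per class
    row = cm.sum(axis=1)  # true samples per class
    precision = np.divide(tp, col, out=np.zeros(n_classes), where=col > 0)
    recall = np.divide(tp, row, out=np.zeros(n_classes), where=row > 0)
    accuracy = tp.sum() / cm.sum()
    return cm, accuracy, precision, recall
```

Reporting per-class precision and recall alongside accuracy matters for activity recognition, where classes (e.g. rare exercises) are often heavily imbalanced.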
The following image shows the configuration and classification results of an algorithm to detect and classify exercises performed by patients after hip replacement surgery.
