
MultiTouch

<p align="center"> <img src="https://raw.githubusercontent.com/Russell-Newton/MultiTouch/main/images/click-2384-black.svg#gh-light-mode-only"> <img src="https://raw.githubusercontent.com/Russell-Newton/MultiTouch/main/images/click-2384-white.svg#gh-dark-mode-only"> </p>

Built with CMake · Demo powered by Emscripten · Docs generated by Doxygen


<!-- ALL-CONTRIBUTORS-BADGE:START - Do not remove or modify this section -->

All Contributors

<!-- ALL-CONTRIBUTORS-BADGE:END -->

A lightweight touch gesture recognition library created in C as part of Georgia Tech's Spring-Fall 2022 Junior Design program.

See the Demo!


Installation

Prerequisites

  1. Install build-essential to have access to make and gcc:

    sudo apt update && sudo apt install build-essential
    
  2. Install CMake:

    sudo apt-get -y install cmake
    

ℹ️ Windows development is possible with tools like Chocolatey.

Option 1: Include Source in Your Project

  1. Clone the repository into your project.

    git clone https://github.com/Russell-Newton/MultiTouch.git <Destination>
    
  2. Include the source in your project

    • If you use CMake, add the repository's gesturelibrary folder as a subdirectory in your project's CMakeLists.txt using add_subdirectory, and delete the section of gesturelibrary/CMakeLists.txt guarded by the SKIP_TESTS if statement.
    • If you do not use CMake, include the files in the gesturelibrary/include folder and add the files in the gesturelibrary/src folder to your executable.
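For the CMake route, the consuming project's CMakeLists.txt might look like the sketch below. The project name `myapp` and the clone location `third_party/MultiTouch` are placeholders, and the target name `GestureLibrary` is an assumption inferred from the built artifact libGestureLibrary.a — check gesturelibrary/CMakeLists.txt for the actual target name.

```cmake
cmake_minimum_required(VERSION 3.10)
project(myapp C)

# Pull in the library sources cloned into third_party/MultiTouch
add_subdirectory(third_party/MultiTouch/gesturelibrary)

add_executable(myapp main.c)
target_link_libraries(myapp GestureLibrary)
```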

Option 2: Build Static Library and Link to Your Project

  1. Clone the repo.

    git clone https://github.com/Russell-Newton/MultiTouch.git
    
  2. Build the CMake project.

    cd MultiTouch
    cmake -S gesturelibrary -B build -D SKIP_TESTS=true
    
  3. Compile the library with make.

    cd build
    make
    
  4. Include the library when compiling your program:

    • Add `-I...pathto/MultiTouch/gesturelibrary/include` to your compile command.
    • Add `...pathto/MultiTouch/build/libGestureLibrary.a` to your compile targets.
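Putting the include path and the archive together, a full compile command might look like this sketch (the `/path/to/` prefix is a placeholder for wherever you cloned and built the repository):

```shell
gcc main.c \
    -I/path/to/MultiTouch/gesturelibrary/include \
    /path/to/MultiTouch/build/libGestureLibrary.a \
    -o myapp
```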
    

Troubleshooting

If build errors occur, make sure you have make and cmake installed and added to your path. Ensure that you have a C compiler like gcc. In Unix, make and gcc can be installed by running:

sudo apt update && sudo apt install build-essential

Other common build issues may be related to where the CMake build directory is located. Make sure you run make from within the directory created by running cmake.


Usage

  1. Include <gesturelib.h> and the header files for any gestures you are interested in. For example, <tap.h> and <drag.h>.
  2. Adjust the gesture parameters in <gestureparams.h> to your desired values. The variables can be set at runtime, but will require the gesture library to be reinitialized after modification.
  3. Call init_gesturelib().
  4. Create an adapter for your touch input device. Adapters transform device input data into touch_event_ts.
  5. Whenever a touch is received, create a touch_event_t with your adapter and send it to process_touch_event().
    • If you want the library to determine which finger this event corresponds to, set event.group = TOUCH_GROUP_UNDEFINED.
  6. Recognized gestures can be obtained from the library synchronously or asynchronously.
  • To synchronously access recognized gestures,

    1. Call the get_[gesture] function of the gesture you are interested in. For example, get_tap and get_drag.
    2. This returns an array of gesture structs for the gesture you are interested in. For example, tap_t and drag_t.
    3. You can read the data from the array, but if a thread is currently executing the process_touch_event() function, then the data in the array may change as you are reading it.
  • To asynchronously access recognized gestures,

    1. Create custom listeners or enable/disable built-in listeners with the provided utility functions:
      • add_recognizer()
      • remove_recognizer()
      • enable_recognizer()
      • disable_recognizer()
    2. Listeners accept a const [gesture_t]* and can read the data from the updated gesture. The gesture data will not change until the next invocation of process_touch_event.

Listeners

Listeners are single functions that accept gesture-specific data and have a void return type. They are called whenever a recognizer's state machine updates its internal state. A listener should be registered after calling init_gesturelib().

Example:

// main.c
#include <stdio.h>
#include <gesturelib.h>
#include <tap.h>

void tap_listener(const tap_t* event) {
    if (event->type == RECOGNIZER_STATE_COMPLETED) {
        printf("Tap received at (%.3f, %.3f)!\n", event->x, event->y);
    }
}

int main(int argc, char *argv[]) {
    init_gesturelib();

    // register the new listener
    set_on_tap(tap_listener);

    // rest of program
    return 0;
}

Design

Touch Preprocessing

After touch data has been transformed into a touch_event_t and sent to our library, the library performs some additional preprocessing. If the event has its group set to TOUCH_GROUP_UNDEFINED, the library determines which touch group it belongs to. If the device provides a touch group, the library leaves it unchanged.

The touch group represents the finger a touch event was made by. That is, touch group 0 corresponds to events created by the first finger pressed, 1 to the second, 2 to the third, and so on.

Touch group assignment is determined by event type:

  • If the event is a down event, attempt to assign it to the first unused group. Track this event as the most recent event in the group it was assigned to, marking the group as active. If there are no unassigned groups, leave the group as unassigned.
  • If the event is a move event, find the active group this event is closest to. Assign it to that group and track this event as the most recent in the group. If there are no active groups, leave it unassigned.
  • If the event is an up event, perform the same logic as with a move event. This time when a group is assigned, the group is marked as inactive.

ℹ️ Group assignment ensures that fingers generate the same group as long as they're in contact with the touch device.

After the preprocessing has finished, a touch event is sent to every enabled recognizer in the order in which they were added to the library.

Recognizers

Gesture recognizers are built like state machines. They receive touch events and update their state. When the state is updated, they call on the registered event listener, if applicable.

Built-in single-finger gesture recognizers save data about every possible touch group that could be performing the gesture they recognize.

Built-in multi-finger recognizers are more complicated: they store data about every possible group for every possible user id. The user id is set by the data adapter and could be determined by factors like which device received the touch or where on the screen the touch was received.

⚠️ All touch events with the same uid will be considered as part of the same multi-finger gesture for recognition purposes.

Gestures

Gesture recognition starts with a base gesture: stroke. All other gestures can be recognized by composing strokes and other composite gestures and performing additional processing on them.

Stroke

Stroke is a simple gesture with a simple state machine:

<p align="center"> <img src="https://raw.githubusercontent.com/Russell-Newton/MultiTouch/main/images/stroke-sm-black.svg#gh-light-mode-only"> <img src="https://raw.githubusercontent.com/Russell-Newton/MultiTouch/main/images/stroke-sm-white.svg#gh-dark-mode-only"> </p>

The state updates are less important than the data that stroke collects. Stroke collects data on:

  • Initial down event position and time
  • Current move/up event position and time
  • Move event speed (as a moving average with configurable window size)
  • Touch group and user

When creating more complicated gestures, having access to this data can be incredibly useful.

Multistroke

Multistroke is the multi-finger counterpart to stroke. All strokes with the same user id get grouped into the same multistroke. The first down event starts a multistroke, and the last up event for the user id ends the gesture.
