MultiTouch
<p align="center"> <img src="https://raw.githubusercontent.com/Russell-Newton/MultiTouch/main/images/click-2384-black.svg#gh-light-mode-only"> <img src="https://raw.githubusercontent.com/Russell-Newton/MultiTouch/main/images/click-2384-white.svg#gh-dark-mode-only"> </p>

<!-- ALL-CONTRIBUTORS-BADGE:START - Do not remove or modify this section --> <!-- ALL-CONTRIBUTORS-BADGE:END -->

A lightweight touch gesture recognition library created in C as a part of Georgia Tech's Spring-Fall 2022 Junior Design program.
Installation
Prerequisites
- Install `build-essential` to have access to `make` and `gcc`:

  ```shell
  sudo apt update && sudo apt install build-essential
  ```

- Install CMake:

  ```shell
  sudo apt-get -y install cmake
  ```
ℹ️ Windows development is possible with tools like Chocolatey.
Option 1: Include Source in Your Project
- Clone the repository into your project:

  ```shell
  git clone https://github.com/Russell-Newton/MultiTouch.git <Destination>
  ```

- Include the source in your project:
  - If you use CMake, then in a `CMakeLists.txt` of your project, add the `gesturelibrary` folder of the repository as a subdirectory using `add_subdirectory`. Delete the section of `gesturelibrary/CMakeLists.txt` in the `SKIP_TESTS` if statement.
  - If you do not use CMake, include the files in the `gesturelibrary/include` folder and add the files in the `gesturelibrary/src` folder to your executable.
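For the CMake route, the wiring might look like the sketch below. The target name `GestureLibrary` is an assumption inferred from the produced static library `libGestureLibrary.a`; check `gesturelibrary/CMakeLists.txt` for the actual target name.

```cmake
# CMakeLists.txt of your project (illustrative sketch, not verbatim from the repo)
cmake_minimum_required(VERSION 3.10)
project(my_app C)

# Pull in the cloned repository's gesturelibrary folder.
add_subdirectory(MultiTouch/gesturelibrary)

add_executable(my_app main.c)

# Target name assumed to match the produced libGestureLibrary.a.
target_link_libraries(my_app GestureLibrary)
```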
Option 2: Build Static Library and Link to Your Project
- Clone the repo:

  ```shell
  git clone https://github.com/Russell-Newton/MultiTouch.git
  ```

- Build the CMake project:

  ```shell
  cd MultiTouch
  cmake -S gesturelibrary -B build -D SKIP_TESTS=true
  ```

- Compile the library with `make`:

  ```shell
  cd build
  make
  ```

- Include the library when compiling your program:
  - Add `-I...pathto/MultiTouch/gesturelibrary/include` to your compile command.
  - Add `...pathto/MultiTouch/build/libGestureLibrary.a` to your compile targets.
Troubleshooting
If build errors occur, make sure you have `make` and `cmake` installed and added to your path, and that you have a C compiler like `gcc`.

On Unix-like systems, `make` and `gcc` can be installed by running:

```shell
sudo apt update && sudo apt install build-essential
```

Other common build issues may be related to the location of the CMake build directory. Make sure you run `make` from within the directory created by running `cmake`.
Usage
- Include `<gesturelib.h>` and the header files for any gestures you are interested in, for example `<tap.h>` and `<drag.h>`.
- Adjust the gesture parameters in `<gestureparams.h>` to your desired values. The variables can be set at runtime, but the gesture library must be reinitialized after they are modified.
- Call `init_gesturelib()`.
- Create an adapter for your touch input device. Adapters transform device input data into `touch_event_t`s.
- Whenever a touch is received, create a `touch_event_t` with your adapter and send it to `process_touch_event()`.
  - If you want the library to determine which finger this event corresponds to, set `event.group = TOUCH_GROUP_UNDEFINED`.
- Recognized gestures can be obtained from the library synchronously or asynchronously.
  - To synchronously access recognized gestures:
    - Call the `get_[gesture]` function of the gesture you are interested in, for example `get_tap` and `get_drag`. This returns an array of gesture structs for that gesture, for example `tap_t` and `drag_t`.
    - You can read the data from the array, but if a thread is currently executing `process_touch_event()`, the data in the array may change as you are reading it.
  - To asynchronously access recognized gestures:
    - Create custom listeners, or enable/disable built-in recognizers, with the provided utility functions: `add_recognizer()`, `remove_recognizer()`, `enable_recognizer()`, `disable_recognizer()`.
    - Listeners accept a `const [gesture]_t*` and can read the data from the updated gesture. The gesture data will not change until the next invocation of `process_touch_event()`.
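Putting the steps above together, the call flow might look like the sketch below. Everything here except the names `touch_event_t`, `process_touch_event`, and `TOUCH_GROUP_UNDEFINED` is a stand-in so the sketch compiles on its own; in a real program you would include `<gesturelib.h>` and use the library's actual definitions, whose field names may differ.

```c
#include <stdio.h>

#define TOUCH_GROUP_UNDEFINED (-1) /* stand-in for the library's constant */

typedef enum { TOUCH_EVENT_DOWN, TOUCH_EVENT_MOVE, TOUCH_EVENT_UP } touch_event_type;

/* Stand-in for the library's touch_event_t; the real field names may differ. */
typedef struct {
    touch_event_type type;
    float x, y;
    int group; /* finger index, or TOUCH_GROUP_UNDEFINED to let the library assign one */
    int uid;   /* user id, set by the adapter */
} touch_event_t;

/* Stand-in for the library's entry point. */
void process_touch_event(const touch_event_t* event) {
    printf("event type=%d at (%.1f, %.1f)\n", event->type, event->x, event->y);
}

/* Example adapter: translate one raw device report into a touch_event_t. */
touch_event_t adapt(touch_event_type type, float x, float y) {
    touch_event_t event = {type, x, y, TOUCH_GROUP_UNDEFINED, 0};
    return event;
}
```

In a real adapter you would read the event type and coordinates from your input device's driver or windowing toolkit, then hand each adapted event to `process_touch_event()` as it arrives.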
Listeners
Listeners are single functions that accept gesture-specific data and have a void return type. They are called whenever a recognizer's state machine updates its internal state. A listener should be registered after calling `init_gesturelib()`.
Example:

```c
// main.c
#include <stdio.h>

#include <gesturelib.h>
#include <tap.h>

void tap_listener(const tap_t* event) {
    if (event->type == RECOGNIZER_STATE_COMPLETED) {
        printf("Tap received at (%.3f, %.3f)!\n", event->x, event->y);
    }
}

int main(int argc, char* argv[]) {
    init_gesturelib();

    // register the new listener
    set_on_tap(tap_listener);

    // rest of program
}
```
Design
Touch Preprocessing
After touch data has been transformed into a `touch_event_t` and sent to the library, the library performs some additional preprocessing. If the event's group is set to `TOUCH_GROUP_UNDEFINED`, the library determines which touch group the event belongs to. If the device already provides a touch group, the library does not assign one.
The touch group represents the finger a touch event was made by. That is, touch group 0 corresponds to events created by the first finger pressed, 1 to the second, 2 to the third, and so on.
Touch group assignment is determined by event type:
- If the event is a down event, attempt to assign it to the first unused group. Track this event as the most recent event in the group it was assigned to, marking the group as active. If there are no unassigned groups, leave the group as unassigned.
- If the event is a move event, find the active group this event is closest to. Assign it to that group and track this event as the most recent in the group. If there are no active groups, leave it unassigned.
- If the event is an up event, perform the same logic as with a move event. This time when a group is assigned, the group is marked as inactive.
ℹ️ Group assignment ensures that fingers generate the same group as long as they're in contact with the touch device.
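The assignment rules above can be sketched in C. This is an illustrative reimplementation, not the library's actual code; the group bookkeeping, the `MAX_GROUPS` limit, and the squared-distance metric are all assumptions.

```c
#include <math.h>

#define MAX_GROUPS 10
#define UNASSIGNED (-1)

typedef enum { DOWN, MOVE, UP } event_type;

typedef struct {
    int active;  /* finger currently in contact */
    float x, y;  /* position of the most recent event in this group */
} group_t;

group_t groups[MAX_GROUPS];

/* Returns the index of the group the event is assigned to, or UNASSIGNED. */
int assign_group(event_type type, float x, float y) {
    if (type == DOWN) {
        /* Down: take the first free group and mark it active. */
        for (int i = 0; i < MAX_GROUPS; i++) {
            if (!groups[i].active) {
                groups[i] = (group_t){1, x, y};
                return i;
            }
        }
        return UNASSIGNED; /* no free groups: leave the event unassigned */
    }
    /* Move/up: find the closest active group. */
    int best = UNASSIGNED;
    float best_d2 = INFINITY;
    for (int i = 0; i < MAX_GROUPS; i++) {
        if (groups[i].active) {
            float dx = groups[i].x - x, dy = groups[i].y - y;
            float d2 = dx * dx + dy * dy;
            if (d2 < best_d2) { best_d2 = d2; best = i; }
        }
    }
    if (best != UNASSIGNED) {
        groups[best].x = x; /* track as the most recent event in the group */
        groups[best].y = y;
        if (type == UP) groups[best].active = 0; /* up deactivates the group */
    }
    return best;
}
```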
After the preprocessing has finished, a touch event is sent to every enabled recognizer in the order in which they were added to the library.
Recognizers
Gesture recognizers are built like state machines. They receive touch events and update their state. When the state is updated, they call on the registered event listener, if applicable.
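As a toy illustration of this pattern (not the library's actual recognizer code; the state names, movement threshold, and listener hook are assumed for the sketch):

```c
#include <stddef.h>

typedef enum { DOWN, MOVE, UP } touch_type;
typedef enum { STATE_IDLE, STATE_IN_PROGRESS, STATE_COMPLETED, STATE_FAILED } recognizer_state;

typedef struct {
    recognizer_state state;
    void (*on_update)(recognizer_state); /* registered listener, may be NULL */
} tap_recognizer;

/* Feed one touch event into the recognizer and update its state. */
void tap_update(tap_recognizer* r, touch_type type, float dx, float dy) {
    recognizer_state next = r->state;
    switch (r->state) {
    case STATE_IDLE:
        if (type == DOWN) next = STATE_IN_PROGRESS;
        break;
    case STATE_IN_PROGRESS:
        if (type == UP) next = STATE_COMPLETED;
        else if (dx * dx + dy * dy > 25.0f) next = STATE_FAILED; /* moved too far for a tap */
        break;
    default: /* COMPLETED or FAILED: a new down event restarts recognition */
        if (type == DOWN) next = STATE_IN_PROGRESS;
        break;
    }
    if (next != r->state) {
        r->state = next;
        if (r->on_update) r->on_update(next); /* notify listener on state change */
    }
}
```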
Built-in single-finger gesture recognizers save data about every possible touch group that could be performing the gesture they recognize.
Built-in multi-finger recognizers are more complicated: they store data about every possible group for every possible user id. The user id is set by the data adapter and could be determined by factors like which device received the touch or where on the screen the touch was received.
⚠️ All touch events with the same uid will be considered part of the same multi-finger gesture for recognition purposes.
Gestures
Gesture recognition starts with a base gesture: stroke. All other gestures are recognized by composing strokes and other composite gestures and performing additional processing on them.
Stroke
Stroke is a simple gesture with a simple state machine:
<p align="center"> <img src="https://raw.githubusercontent.com/Russell-Newton/MultiTouch/main/images/stroke-sm-black.svg#gh-light-mode-only"> <img src="https://raw.githubusercontent.com/Russell-Newton/MultiTouch/main/images/stroke-sm-white.svg#gh-dark-mode-only"> </p>

The state updates are less important than the data that stroke collects. Stroke collects data on:
- Initial down event position and time
- Current move/up event position and time
- Move event speed (as a moving average with configurable window size)
- Touch group and user
When creating more complicated gestures, having access to this data can be incredibly useful.
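The speed tracking described above, a moving average with a configurable window, might be sketched like this; the ring buffer, `WINDOW_SIZE`, and function names are illustrative, not the library's own:

```c
#define WINDOW_SIZE 4 /* configurable window, analogous to a gesture parameter */

typedef struct {
    float samples[WINDOW_SIZE];
    int count; /* samples seen so far, saturating at WINDOW_SIZE */
    int next;  /* ring-buffer write position */
} moving_average;

/* Insert one instantaneous speed sample and return the windowed average. */
float moving_average_push(moving_average* ma, float speed) {
    ma->samples[ma->next] = speed;
    ma->next = (ma->next + 1) % WINDOW_SIZE;
    if (ma->count < WINDOW_SIZE) ma->count++;
    float sum = 0.0f;
    for (int i = 0; i < ma->count; i++) sum += ma->samples[i];
    return sum / ma->count;
}
```

Averaging over a window smooths out jitter in per-event speeds, at the cost of reacting to speed changes one window-length late.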
Multistroke
Multistroke is a multi-finger counterpart to stroke. All strokes with the same user id get grouped into the same multistroke. The first down event starts a multistroke, and the last up event for the user id ends the gesture.