Nudged
Estimate scale, rotation, and translation between two sets of 2D points. For multitouch, calibration, pattern recognition, and more.

Nudged is a JavaScript module to efficiently estimate translation, scale, and rotation between two sets of 2D points. It enables you to capture transformations that you can use for motion dynamics, calibration, geometry snapping, and mapping between coordinate spaces. It has already been applied to user interface geometry [1], multi-touch recognition [1], and eye tracker calibration [2].
Table of contents
- Installation
- Introduction
- Usage
- Example apps
- API docs
- For developers
- Acknowledgments
- Versioning
- Licence
- See also
- GitHub
Installation
Install nudged with npm, yarn, or another compatible package manager. The package comes in two flavors: functional and object-oriented.
Install the latest, functional nudged:

```
$ npm install nudged
```
Alternatively, install the object-oriented nudged 1.x:

```
$ npm install nudged@1
```
A standalone UMD bundle is available via the Unpkg CDN for nudged@2.1.0 and later:

```
<script src="https://www.unpkg.com/nudged/dist/nudged.min.js"></script>
```
Nudged is also available in Python.
Introduction
In general, you can apply Nudged in any situation where you want to capture a 2D transformation from the movement of any number of control points. You have a set of points that move from some source coordinates to some target coordinates, and you want to capture the movement pattern between the source and the target as a 2D transformation, for example in order to apply it to something else, such as a photo or some other object. See the image below for the transformations Nudged can estimate, illustrated with two control points and a photo.
<img src="doc/transformation-types.jpg" alt="Types of transformation estimators"/><br> Image: Available transformation estimators. Each estimator has an abbreviated name, for example 'SR', according to the free parameters to estimate. The black-white dots and connecting arrows represent movement of two control points. Given the control points, Nudged estimates a transformation. The pairs of photos represent the effect of the resulting transformation. For easy visual comparison, the control points and the initial image positions are kept the same for each estimator.
Mathematically speaking, Nudged is a set of optimal least squares estimators for the group of nonreflective similarity transformation matrices, also called Helmert transformations. Such transformations are affine transformations with translation, rotation, and/or uniform scaling, and without reflection or shearing. The estimation has a time complexity of O(n), where n is the cardinality (size) of the point sets. In other words, Nudged solves a 2D to 2D point set registration problem (also known as Procrustes superimposition) in linear time. The algorithms and their efficiency are thoroughly described in an M.Sc. thesis, Advanced algorithms for manipulating 2D objects on touch screens.
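To make the closed-form least squares solution concrete, here is a plain-JavaScript sketch for the full translation-scale-rotation ('TSR') case. It illustrates the math and the single O(n) pass over the points; it is not the library's actual implementation:

```javascript
// Sketch of the closed-form least squares TSR estimate: center both
// point sets on their centroids, then solve the linear part (a, b)
// and derive the translation. Not the library's actual source.
function estimateTSR (domain, range) {
  const n = domain.length
  // Centroids of both point sets.
  let px = 0, py = 0, qx = 0, qy = 0
  for (let i = 0; i < n; i += 1) {
    px += domain[i].x; py += domain[i].y
    qx += range[i].x; qy += range[i].y
  }
  px /= n; py /= n; qx /= n; qy /= n
  // Sums over the centered points.
  let dot = 0, cross = 0, norm = 0
  for (let i = 0; i < n; i += 1) {
    const dx = domain[i].x - px
    const dy = domain[i].y - py
    const rx = range[i].x - qx
    const ry = range[i].y - qy
    dot += dx * rx + dy * ry
    cross += dx * ry - dy * rx
    norm += dx * dx + dy * dy
  }
  const a = dot / norm
  const b = cross / norm
  // Translation moves the transformed domain centroid onto the range centroid.
  return { a: a, b: b, x: qx - a * px + b * py, y: qy - b * px - a * py }
}

// The example points used in the Usage section below.
const domain = [{ x: 0, y: 2 }, { x: 2, y: 2 }, { x: 1, y: 4 }]
const range = [{ x: 4, y: 4 }, { x: 4, y: 2 }, { x: 6, y: 3 }]
const tran = estimateTSR(domain, range)
// tran ≈ { a: 0, b: -1, x: 2, y: 4 }, a clockwise quarter-turn plus translation
```

Note how the rotation and scaling are both captured by the single pair (a, b), which is what keeps the estimate linear in the number of points.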
The development has been supported by the Infant Cognition Laboratory at Tampere University, where Nudged is used to correct eye tracking data. Yet the main motivation for Nudged comes from Tapspace.js, a zoomable user interface library where smooth and fast scaling by touch is crucial.
Usage
Let domain be a set of points, [{ x, y }, ...]. Let range be the same points after an unknown transformation T as illustrated in the figure below.
```
const domain = [{ x: 0, y: 2 }, { x: 2, y: 2 }, { x: 1, y: 4 }]
const range = [{ x: 4, y: 4 }, { x: 4, y: 2 }, { x: 6, y: 3 }]
```
<img src="doc/img/nudged-diagram-6-4-domain-range.png" alt="The domain and the range" /><br> Figure: The domain (circles o) and the range (crosses x). The + marks the point {x:0,y:0}.
We would like to find a simple 2D transformation tran that simulates T as closely as possible by combining translation, scaling, and rotation. We compute tran by calling nudged.estimate:
```
const tran = nudged.estimate({
  estimator: 'TSR',
  domain: domain,
  range: range
})
```
The result is a transform object:

```
> tran
{ a: 0, b: -1, x: 2, y: 4 }
```
You can apply tran to a point with point.transform:
```
> nudged.point.transform({ x: 0, y: 4 }, tran)
{ x: 6, y: 4 }
```
<img src="doc/img/nudged-diagram-6-7-point-transform.png" alt="A point is being transformed" /><br> Figure: A point {x:0, y:4} is transformed by the estimated transform.
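For intuition, a transform object { a, b, x, y } can be read as the augmented matrix [[a, -b, x], [b, a, y]], which matches the transform.toMatrix output shown later in this README. A hand-rolled sketch of the same point mapping:

```javascript
// Apply a transform { a, b, x, y } to a point by hand. This mirrors
// what nudged.point.transform computes, reading the transform as the
// augmented matrix [[a, -b, x], [b, a, y]].
function applyTransform (point, tran) {
  return {
    x: tran.a * point.x - tran.b * point.y + tran.x,
    y: tran.b * point.x + tran.a * point.y + tran.y
  }
}

// A clockwise quarter-turn plus translation, like the estimate above.
const tran = { a: 0, b: -1, x: 2, y: 4 }
const p = applyTransform({ x: 0, y: 4 }, tran)
// → { x: 6, y: 4 }
```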
You can apply tran to other geometric shapes as well, for example to correct the orientation based on some sensor data. In the case of HTML image elements, just convert tran to a CSS transform string with transform.toString:
```
> img.style.transform = nudged.transform.toString(tran)
```
<img src="doc/img/nudged-diagram-10-2-photo-transform.jpg" alt="A photograph is being transformed" /><br> Figure: An HTML image before and after the transform we estimated from the points.
The nudged.transform module provides lots of tools to process transform objects. For example, to make a transformation that maps the range back to the domain instead of the other way around, invert the transform with transform.inverse:
```
> const inv = nudged.transform.inverse(tran)
> nudged.point.transform({ x: 6, y: 4 }, inv)
{ x: 0, y: 4 }
```
<img src="doc/img/nudged-diagram-6-8-transform-inverse.png" alt="A point transformed by the inverse of the estimate." /><br> Figure: A point is transformed by the inverse of the estimated transform.
See nudged.transform for more tools and details.
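As a side note, the inverse of a nonreflective similarity transform has a simple closed form. The following is a sketch of the math behind transform.inverse, not the library's actual source:

```javascript
// Invert a nonreflective similarity transform { a, b, x, y } by hand.
// The linear part (a, b) inverts by conjugating and dividing by the
// squared scale; the translation is then mapped back through it.
function invertTransform (tran) {
  const d = tran.a * tran.a + tran.b * tran.b // squared scale; must be nonzero
  const a = tran.a / d
  const b = -tran.b / d
  return {
    a: a,
    b: b,
    x: -a * tran.x + b * tran.y,
    y: -b * tran.x - a * tran.y
  }
}

const inv = invertTransform({ a: 0, b: -1, x: 2, y: 4 })
// inv = { a: 0, b: 1, x: 4, y: -2 }
```

Applying inv to { x: 6, y: 4 } indeed returns { x: 0, y: 4 }, undoing the forward transform.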
Set a center point
To estimate scalings and rotations around a fixed point, give an additional center parameter. Only the estimators S, R, and SR respect the center parameter.
```
const center = { x: 4, y: 0 }
const rotateAround = nudged.estimate({
  estimator: 'R',
  domain: domain,
  range: range,
  center: center
})
```
You can think of the center point as a nail that keeps an elastic sheet of rubber fixed onto a table. The nail retains its location regardless of how the rubber sheet is rotated or stretched around it.
<img src="doc/img/nudged-diagram-7-4-rotation-around-center.png" alt="A rotation around a fixed center point" /><br> Figure: Rotation around a center point (⊕) maps the domain (o) as close to the range (x) as possible. Here the mapped image (●) cannot match the range exactly due to the restriction set by the center point. The + denotes the point {x:0, y:0}.
To test the resulting transform, we can apply it to the center point and observe that the point stays the same.
```
> nudged.point.transform(center, rotateAround)
{ x: 4, y: 0 }
```
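The fixed-center rotation estimate itself also has a short closed form: center both point sets on the center point and pick the angle that best aligns them in the least squares sense. A sketch illustrating the math, not the library's code:

```javascript
// Estimate a pure rotation around a fixed center point c: the optimal
// angle comes from the dot and cross products of the centered point
// sets, and the translation is chosen so that c maps onto itself.
function estimateRotationAround (domain, range, c) {
  let dot = 0, cross = 0
  for (let i = 0; i < domain.length; i += 1) {
    const dx = domain[i].x - c.x
    const dy = domain[i].y - c.y
    const rx = range[i].x - c.x
    const ry = range[i].y - c.y
    dot += dx * rx + dy * ry
    cross += dx * ry - dy * rx
  }
  const angle = Math.atan2(cross, dot)
  const a = Math.cos(angle)
  const b = Math.sin(angle)
  // Translation keeps the center fixed: c = R c + t, so t = c - R c.
  return { a: a, b: b, x: c.x - a * c.x + b * c.y, y: c.y - b * c.x - a * c.y }
}

// The same example points and center as above.
const domain = [{ x: 0, y: 2 }, { x: 2, y: 2 }, { x: 1, y: 4 }]
const range = [{ x: 4, y: 4 }, { x: 4, y: 2 }, { x: 6, y: 3 }]
const center = { x: 4, y: 0 }
const rot = estimateRotationAround(domain, range, center)
// Applying rot to center returns center itself.
```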
To estimate scalings with respect to a center point, as illustrated below, set estimator: 'S'. This scaling operation is also called a homothety.
```
const s = nudged.estimate({
  estimator: 'S',
  domain: domain,
  range: range,
  center: center
})
```
<img src="doc/img/nudged-diagram-8-8-scaling-estimation.png" alt="Scaling about a center point (⊕)" /><br> Figure: The domain (o) is scaled towards the center point (⊕) so that the resulting image (●) lies as close to the range (x) as possible.
See estimators.S, estimators.R, and estimators.SR for further details.
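The fixed-center scaling estimate reduces to a single ratio: the dot product of the centered point sets over the squared norm of the centered domain. A sketch of this math (not the library's implementation):

```javascript
// Estimate a uniform scaling (homothety) about a fixed center point c
// in the least squares sense. The optimal factor s minimizes the sum
// of squared distances between s * (p - c) and (q - c).
function estimateScalingAround (domain, range, c) {
  let dot = 0, norm = 0
  for (let i = 0; i < domain.length; i += 1) {
    const dx = domain[i].x - c.x
    const dy = domain[i].y - c.y
    dot += dx * (range[i].x - c.x) + dy * (range[i].y - c.y)
    norm += dx * dx + dy * dy
  }
  const s = dot / norm // the optimal scaling factor
  // Pure scaling: b is zero, and the translation keeps the center fixed.
  return { a: s, b: 0, x: (1 - s) * c.x, y: (1 - s) * c.y }
}

// The same example points and center as above.
const domain = [{ x: 0, y: 2 }, { x: 2, y: 2 }, { x: 1, y: 4 }]
const range = [{ x: 4, y: 4 }, { x: 4, y: 2 }, { x: 6, y: 3 }]
const center = { x: 4, y: 0 }
const scaling = estimateScalingAround(domain, range, center)
// scaling.b is 0, and applying scaling to center returns center itself.
```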
Analyse the transform
To examine properties of the resulting transformation matrix:
```
> nudged.transform.getRotation(tran)
-1.5707... = -π / 2
> nudged.transform.getScale(tran)
1.0
> nudged.transform.getTranslation(tran)
{ x: 2, y: 4 }
> nudged.transform.toMatrix(tran)
{ a: 0, c: 1, e: 2,
  b: -1, d: 0, f: 4 }
```
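These readouts follow directly from the transform components: (a, b) is the first column of the linear part of the matrix, so its angle gives the rotation and its length gives the scale. A quick sketch:

```javascript
// Derive rotation, scale, and translation from { a, b, x, y } by hand.
const tran = { a: 0, b: -1, x: 2, y: 4 }

const rotation = Math.atan2(tran.b, tran.a) // → -1.5707... = -π / 2
const scale = Math.sqrt(tran.a * tran.a + tran.b * tran.b) // → 1
const translation = { x: tran.x, y: tran.y } // → { x: 2, y: 4 }
```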
To compare how well the transform fits the domain to the range, you can compute the residual error, for example the mean squared error between the transformed domain and the range.
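As a sketch of that comparison in plain JavaScript (meanSquaredError below is a hypothetical helper name, not part of the nudged API):

```javascript
// Mean squared error between the transformed domain and the range:
// transform each domain point and accumulate squared distances to the
// corresponding range point.
function meanSquaredError (tran, domain, range) {
  let sum = 0
  for (let i = 0; i < domain.length; i += 1) {
    // Apply the transform, read as the matrix [[a, -b, x], [b, a, y]].
    const tx = tran.a * domain[i].x - tran.b * domain[i].y + tran.x
    const ty = tran.b * domain[i].x + tran.a * domain[i].y + tran.y
    sum += Math.pow(tx - range[i].x, 2) + Math.pow(ty - range[i].y, 2)
  }
  return sum / domain.length
}

// The same example points as above.
const domain = [{ x: 0, y: 2 }, { x: 2, y: 2 }, { x: 1, y: 4 }]
const range = [{ x: 4, y: 4 }, { x: 4, y: 2 }, { x: 6, y: 3 }]
const mse = meanSquaredError({ a: 0, b: -1, x: 2, y: 4 }, domain, range)
// → 0, because this transform maps the example domain exactly onto the range
```

A zero error means a perfect fit; larger values indicate that the chosen estimator cannot fully capture the movement of the points.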
