========================================================================
README for Augmented Reality Sandbox (SARndbox) version 2.6
Copyright (c) 2012-2018 Oliver Kreylos
========================================================================

Overview

The Augmented Reality Sandbox is an augmented reality application scanning a sand surface using a Kinect 3D camera, and projecting a real-time updated topography map with topographic contour lines, hillshading, and an optional real-time water flow simulation back onto the sand surface using a calibrated projector.

Requirements

The Augmented Reality Sandbox requires Vrui version 4.6 build 005 or newer, and the Kinect 3D Video Capture Project version 3.7 or newer.

Installation Guide

It is recommended to download or move the source packages for Vrui, the Kinect 3D Video Capture Project, and the Augmented Reality Sandbox into a src directory underneath the user's home directory. Otherwise, references to ~/src in the following instructions need to be changed.
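A minimal sketch of the recommended layout, assuming the three source packages were downloaded to ~/Downloads (the download location and archive names are assumptions; substitute the actual file names):

```shell
# Create the recommended source directory under the home directory.
mkdir -p ~/src

# Move the downloaded source packages into it; replace <version> and
# <build> with the actual version numbers of the downloaded files.
mv ~/Downloads/Vrui-<version>-<build>.tar.gz ~/src/
mv ~/Downloads/Kinect-<version>.tar.gz ~/src/
mv ~/Downloads/SARndbox-<version>.tar.gz ~/src/
```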

  1. Install Vrui from ~/src/Vrui-<version>-<build> (see Vrui README file).

  2. Install the Kinect 3D Video Capture Project from ~/src/Kinect-<version> (see the Kinect 3D Video Capture Project README file).

  3. Change into the ~/src directory and unpack the Augmented Reality Sandbox tarball:

    cd ~/src
    tar xfz <download path>/SARndbox-<version>.tar.gz

    - or -

    tar xf <download path>/SARndbox-<version>.tar

  4. Change into the Augmented Reality Sandbox's base directory:

    cd SARndbox-<version>

  5. If the Vrui version installed in step 1 was not 4.6, or Vrui's installation directory was changed from the default of /usr/local, adapt the makefile using a text editor. Change the value of VRUI_MAKEDIR close to the beginning of the file as follows:

    VRUI_MAKEDIR := <Vrui install dir>/share/make

    where <Vrui install dir> is the installation directory chosen in step 1. Use $(HOME) to refer to the user's home directory instead of ~.

  6. Build the Augmented Reality Sandbox:

    make

  7. Install the Augmented Reality Sandbox in the selected target location. To install:

    sudo make install

    This will copy all executables into <INSTALLDIR>/bin, all configuration files into <INSTALLDIR>/etc/SARndbox-<version>, all resource files into <INSTALLDIR>/share/SARndbox-<version>, and all shader source codes into <INSTALLDIR>/share/SARndbox-<version>/Shaders.
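Because VRUI_MAKEDIR is an ordinary make variable, a non-default Vrui location can likely also be passed on the make command line instead of editing the makefile. This is a sketch under the assumption that the makefile does not mark the variable with GNU make's override directive (command-line assignments take precedence over := assignments in the makefile); the $HOME/Vrui-4.6 path is an example, not a required location:

```shell
# Build and install against a Vrui installed under $HOME/Vrui-4.6
# instead of the default /usr/local.
make VRUI_MAKEDIR=$HOME/Vrui-4.6/share/make
sudo make install VRUI_MAKEDIR=$HOME/Vrui-4.6/share/make
```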

Use

The Augmented Reality Sandbox package contains the sandbox application itself, SARndbox, and a calibration utility to interactively measure a transformation between the Kinect camera scanning the sandbox surface, and the projector projecting onto it. The setup procedure described below also uses several utilities from the Kinect 3D video capture project.

Setup and Calibration

Before the Augmented Reality Sandbox can be used, the hardware (physical sandbox, Kinect camera, and projector) has to be set up properly, and the various components have to be calibrated internally and with respect to each other. While the sandbox can be run in "trial mode" with very little required setup, for the full effect the following steps have to be performed in order:

  1. (Optional) Calculate per-pixel depth correction coefficients for the Kinect camera.

  2. (Optional) Internally calibrate the Kinect camera. We strongly recommend skipping this step on initial installation, and only performing it if there are intolerable offsets between the real sand surface in the AR Sandbox and the projected topographic image.

  3. Mount the Kinect camera above the sandbox so that it is looking straight down, and can see the entire sand surface. Use RawKinectViewer from the Kinect 3D video capture project to line up the depth camera while ignoring the color camera.

  4. Measure the base plane equation of the sand surface relative to the Kinect camera's internal coordinate system using RawKinectViewer's plane extraction tool. (See "Using Vrui Applications" in the Vrui HTML documentation on how to use RawKinectViewer, and particularly on how to create / destroy tools.)

  5. Measure the extents of the sand surface relative to the Kinect camera's internal coordinate system using KinectViewer and a 3D measurement tool.

  6. Mount the projector above the sand surface so that it projects its image perpendicularly onto the flattened sand surface, and so that the projector's field-of-projection and the Kinect camera's field-of-view overlap as much as possible. Focus the projector to the flattened average-height sand surface.

  7. Calculate a calibration matrix from the Kinect camera's camera space to projector space using the CalibrateProjector utility and a circular calibration target (a CD with a fitting white paper disk glued to one surface).

  8. Test the setup by running the Augmented Reality Sandbox application.
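In terms of commands, the steps above use the following utilities, which the Kinect package and the Sandbox package install into <INSTALLDIR>/bin. The exact command-line options vary between versions and setups, so the bare invocations below are a sketch, not a definitive recipe; consult each tool's documentation for its options:

```shell
# Steps 1-4: view the raw depth stream, run the "Calibrate Depth Lens"
# tool, and extract the sand surface's base plane.
RawKinectViewer

# Step 5: measure the sand surface extents with a 3D measurement tool.
KinectViewer

# Step 7: compute the calibration matrix from camera space to
# projector space using the circular calibration target.
CalibrateProjector

# Step 8: run the sandbox application itself.
SARndbox
```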

Step 1: Per-pixel depth correction

Kinect cameras have non-linear distortions in their depth measurements due to uncorrected lens distortions in the depth camera. The Kinect 3D video capture project has a calibration tool to gather per-pixel correction factors to "straighten out" the depth image.

To calculate depth correction coefficients, start the RawKinectViewer utility and create a "Calibrate Depth Lens" tool. (See "Using Vrui Applications" in the Vrui HTML documentation on how to create tools.) Then find a completely flat surface, and point the Kinect camera perpendicularly at that surface from a variety of distances. Ensure that the depth camera only sees the flat surface and no other objects, and that there are no holes in the depth images.

Then capture one depth correction tie point for each distance between the Kinect camera and the flat surface:

  1. Line up the Kinect camera.

  2. Capture an average depth frame by selecting the "Average Frames" main menu item, and wait until a static depth frame is displayed.

  3. Create a tie point by pressing the first button bound to the "Calibrate Depth Lens" tool.

  4. De-select the "Average Frames" main menu item, and repeat from step 1 until the surface has been captured from sufficiently many distances.

After all tie points have been collected:

  1. Press the second button bound to the "Calibrate Depth Lens" tool to calculate the per-pixel depth correction factors based on the collected tie points. This will write a depth correction file to the Kinect 3D video capture project's configuration directory, and print a status message to the terminal.

Step 2: Internally calibrate the Kinect camera

Individual Kinect cameras have slightly different internal layouts and slightly different optical properties, meaning that their internal calibrations, i.e., the projection matrices defining how to project depth images back out into 3D space, and how to project color images onto those reprojected depth images, differ individually as well. While all Kinects are factory-calibrated and contain the necessary calibration data in their firmware, the format of those data is proprietary and cannot be read by the Kinect 3D video capture project software, meaning that each Kinect camera has to be calibrated internally before it can be used. In practice, the differences are small, and a Kinect camera can be used without internal calibration by assigning default calibration values, but it is strongly recommended to perform calibration on each device individually.

The internal calibration procedure requires a semi-transparent calibration target; precisely, a checkerboard with alternating clear and opaque tiles. Such a target can be constructed by gluing a large sheet of paper to a clear glass plate, drawing or ideally printing a checkerboard onto it, and cutting out all "odd" tiles using large rulers and a sharp knife. It is important that the tiles are lined up precisely and have precise sizes, and that the clear tiles are completely clean without any dust, specks, or fingerprints. Calibration targets can have a range of sizes and numbers of tiles, but we found the ideal target to contain 7x5 tiles of 3.5"x3.5" each.
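As a quick sanity check on those dimensions, the overall footprint of the recommended target follows from simple arithmetic (the computation is illustrative, not part of the README):

```python
# Recommended calibration target: 7 x 5 tiles of 3.5" x 3.5" each.
tiles_x, tiles_y = 7, 5
tile_size_in = 3.5

width_in = tiles_x * tile_size_in   # overall width in inches
height_in = tiles_y * tile_size_in  # overall height in inches
print(f"target size: {width_in} x {height_in} inches")  # 24.5 x 17.5
```

So the glass plate and paper sheet need to accommodate roughly a 24.5" x 17.5" grid, plus a margin for handling.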

Given an appropriate calibration target, the calibration process is performed using RawKinectViewer and its "Draw Grids" tool. The procedure is to show the calibration target to the Kinect camera from a variety of angles and distances, and to capture a calibration tie point for each viewpoint by fitting a grid to the target's images in the depth and color streams interactively.

The detailed procedure is:

  1. Aim Kinect camera at calibration target from a certain position and angle. It is important to include several views where the calibration target is seen at an angle.

  2. Capture an average depth frame by selecting the "Average Frames" main menu item, and wait until a static depth frame is displayed.

  3. Drag the virtual grids displayed in the depth and color frames using the "Draw Grid" tool's first button until the virtual grids exactly match the calibration target. Matching the target in the depth frame is relatively tricky due to the inherent fuzziness of the Kinect's depth camera. Doing this properly will probably take some practice. The important idea is to get a "best fit" between the calibration target and the grid. For the particular purpose of the Augmented Reality Sandbox, the color frame grids can be completely ignored because only the depth camera is used; however, since calibration files are shared between all uses of the Kinect 3D video capture project, it is best to perform a full, depth and color, calibration.

  4. Press the "Draw Grid" tool's second button to store the just-created calibration tie point.

  5. Deselect the "Average Frames" main menu entry, and repeat from step 1 until a sufficient number of calibration tie points has been captured.
