
Official code repository accompanying the paper "MAROON: A Dataset for the Joint Characterization of Near-Field High-Resolution Radio-Frequency and Optical Depth Imaging Techniques"


[![CC BY-NC 4.0][cc-by-nc-shield]][cc-by-nc]

[![CC BY-NC 4.0][cc-by-nc-image]][cc-by-nc]

Project Page 🌐 | Paper 🗒️ | Arxiv 📚 | Dataset Full (~200GB) 📁 | Dataset Mini (~1GB) 📁 | Data Live Viewer 🌐

🌰 MAROON Documentation

This is the official code repository accompanying the paper MAROON.

This repository includes a Python-based dataset viewer and provides data preprocessing code for all sensor measurements of:

  • Microsoft Kinect Azure
  • Intel Realsense D435i
  • Stereolabs Zed X Mini
  • Rohde & Schwarz's QAR50 radar submodule
  • Ground-truth Multi-View Stereo reconstructions with Agisoft Metashape

Furthermore, it includes the reconstruction code for the MIMO imaging radar measurements.


Updates

  • [January 15, 2026] We have added synthetic ground truth for some of the objects. For each of these objects, it can be found at photogrammetry/mesh_synthetic.obj.
  • [January 15, 2026] We have added an additional capture object: 47_bunny_box_centered
  • [January 15, 2026] 📣 The dataset is now available!

<img src="assets/viewer.gif" height="555"><img src="assets/viewmode.gif" height="555">


Dataset

  • A preview of the dataset can be found at: https://maroon-dataset.github.io/
  • A mini version of the data for one object is available at: https://faubox.rrze.uni-erlangen.de/getlink/fi43P9pBvMVCGz5xJSfRRM/maroon_mini.zip
  • The full dataset is available on Zenodo: https://zenodo.org/records/18254440. It is split into several .zip files:
    • [Required] 0_maroon_v2_meta.zip: Contains the metadata and calibration files about labeled objects.
    • [On Demand of Sensor] 1_maroon_v2_radar_raw.zip: Contains the raw radar measurements, on which reconstruction can be performed by using this repository
    • [Optional] 1_maroon_radar_cached_1.zip, 1_maroon_radar_cached_2.zip, 1_maroon_radar_cached_3.zip: Contain the cached reconstructions. Downloading these is useful if you do not have a powerful GPU or do not want to wait for the reconstruction to be computed first.
    • [On Demand of Sensor] 2_maroon_v2_kinect.zip : Contains the optical time-of-flight measurements (RGB, Infrared, Depth) of Microsoft's Kinect Azure Camera, with mask/segmentation annotations
    • [On Demand of Sensor] 3_maroon_v2_realsense.zip : Contains the optical active stereo measurements (RGB, Depth) of the Intel Realsense D435i, with mask/segmentation annotations
    • [On Demand of Sensor] 4_maroon_v2_zed.zip : Contains the optical passive stereo measurements (RGB, Depth) of Stereolabs' ZED camera, with mask/segmentation annotations
    • [On Demand of Sensor] 5_maroon_v2_mvs.zip: Contains the ground-truth measurements obtained from a Multi View Stereo (MVS) system of 5 DSLR cameras, with mask/segmentation annotations
    • [Optional] 6_maroon_v2_extra.zip: Contains the calibration measurements and additional measurements of the empty measurement room, without any object placed in front.

As the Zenodo website can experience timeouts, it is recommended to download the data via the Python package zenodo-get:

pip install zenodo-get

The data can be downloaded by executing:

zenodo_get 18254440 -R 10

or for each .zip file individually:

zenodo_get 18254440 -R 10 -g "0_maroon_v2_meta.zip"
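If you prefer to drive the download from Python, the CLI invocations above can be assembled programmatically. The helper below is a sketch only (it assumes zenodo_get is installed and on your PATH); the record ID and archive names are taken from this README:

```python
# Sketch: build zenodo_get download commands for selected archives.
RECORD_ID = "18254440"  # the MAROON Zenodo record

def zenodo_get_command(archive=None, retries=10):
    """Return the zenodo_get CLI invocation as an argument list."""
    cmd = ["zenodo_get", RECORD_ID, "-R", str(retries)]
    if archive is not None:
        cmd += ["-g", archive]  # -g restricts the download via a glob pattern
    return cmd

# Download only the required metadata archive:
print(" ".join(zenodo_get_command("0_maroon_v2_meta.zip")))
# Pass the list to e.g. subprocess.run(...) to actually execute it.
```

Building the argument list first (instead of a shell string) makes it easy to loop over the per-sensor archives you actually need.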

Dependencies

  • All basic dependencies are listed in code/setup.py. To install the maroon package, run:
cd code
python3 -m pip install .

[Optional]: To visualize all sensor data, the installation requires several sensor-specific packages:

  • pykinect_azure (Microsoft Azure Kinect)
  • pyzed (Stereolabs ZED)
  • pyrealsense2 (Intel Realsense)

If you only want to visualize photogrammetry and radar data, you can skip installing these dependencies. In that case, make sure to adjust the sensors_in_use option in your configuration file (see configuration). Further installation instructions for these additional packages are provided below.
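As an illustration of why sensors_in_use must match the installed bindings, the check below sketches how a viewer could skip sensors whose optional Python packages are missing. Only the option name sensors_in_use and the package names come from this README; the surrounding logic is hypothetical:

```python
# Illustrative only: skip sensors whose optional bindings are not installed.
import importlib.util

# e.g. your configuration file restricts the viewer to these sensors:
sensors_in_use = ["photogrammetry", "radar", "kinect"]

# Optional bindings required per sensor (photogrammetry/radar need none):
required_module = {
    "kinect": "pykinect_azure",
    "zed": "pyzed",
    "realsense": "pyrealsense2",
}

available = [
    s for s in sensors_in_use
    if s not in required_module
    or importlib.util.find_spec(required_module[s]) is not None
]
print(available)  # sensors that can actually be visualized on this machine
```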

Pykinect Azure (Optional)

If you wish to include the Microsoft Kinect, you need to download the Microsoft Kinect Azure SDK, version 1.3.0. Installation instructions for Linux and Windows are provided here; however, they might not work on your Linux system (see the instructions below).

As installation via the system package manager has been discontinued since Ubuntu 18.04, it is advisable to build the SDK from source. Make sure the following components are installed:

sudo apt-get install ninja-build

Then, install version 1.3.0 of the SDK via:

git clone https://github.com/microsoft/Azure-Kinect-Sensor-SDK.git
cd Azure-Kinect-Sensor-SDK
git checkout v1.3.0
mkdir build && cd build
cmake .. -GNinja
ninja
sudo ninja install

After that, you should have two libraries: libk4a and libk4a-dev

To have a Python wrapper for the kinect SDK, this repository adapts code from Ibai Gorordo's pyKinectAzure, which is located in external/pykinect_azure. To install this package, run:

cd external/pykinect_azure
python -m pip install .

Pyzed (4.2.0) (Optional)

Installation instructions are explained here

Install the following packages

python -m pip install cython numpy opencv-python pyopengl

To install the ZED Python package, you need to download the ZED SDK here. If this causes any problems, please note that the currently tested version for this setup is 4.2.0. After installation, you can obtain the pyzed package by invoking the installation script in the SDK's installation directory:

python get_python_api.py

Pyrealsense2 (Optional)

The package can simply be installed with

python -m pip install pyrealsense2

Dataset Structure

Overview

The MAROON dataset contains the following structure:

maroon
|
|--> <objectname1>
    # object-to-sensor distance at 30 cm (from radar's perspective)
    |--> 30
        |--> metadata.json
        |--> alignment.json
        |--> radar_72.0_82.0_128
        |--> photogrammetry
        |--> kinect
        |--> zed
        |--> realsense
    # object-to-sensor distance at 40 cm (from radar's perspective)
    |--> 40
    # object-to-sensor distance at 50 cm (from radar's perspective)
    |--> 50
|--> <objectname2>
|--> ...
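Following the layout above, per-capture paths can be assembled directly; the snippet below is a sketch where the dataset root and object name are placeholders you substitute with your own:

```python
from pathlib import Path

# Placeholder values; substitute your own dataset root and object name.
root = Path("maroon")
obj, distance_cm = "<objectname1>", 30  # captures exist at 30, 40, and 50 cm

capture = root / obj / str(distance_cm)
metadata_file = capture / "metadata.json"
alignment_file = capture / "alignment.json"
radar_dir = capture / "radar_72.0_82.0_128"
sensor_dirs = {s: capture / s for s in ("photogrammetry", "kinect", "zed", "realsense")}

print(metadata_file)
```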

The metadata.json file contains metadata about the object capture such as:

{
    # name of the coarse calibration object that is located in the `calibration_01` directory
    "calibration": "01_registration_coarse",
    # name of the fine calibration object that is located in the `calibration_01` directory
    "calibration_fine": "01_registration_fine",
    # approximate distance of the object to the radar sensor 
    "distance_meters": 0.3,
    # mask erosion kernel size that is used in the paper to evaluate P_e
    "mask_erosion": 20,
    # object labels that were used in GroundedSAM for semi-automatic object segmentation
    "labels": ["sigma"],
    # bounding box parameters (in radar space, in meters) that limit the spatial extents, in which the depth deviation metrics are calculated
    "mask_bb": {"zmin": 0.23, "zmax": 0.1},
    # name of the 'empty' measurement (without any target) of the room setup that was done before capturing the object in front (see explanation for radar data below)
    "empty_measurement": "01_empty",
}
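Reading the file is plain JSON (the comments above are explanatory only; real JSON files contain none). A minimal sketch using the example values from above:

```python
import json

# The example metadata from above, with the explanatory comments removed
# (JSON itself does not allow comments).
metadata_text = """{
    "calibration": "01_registration_coarse",
    "calibration_fine": "01_registration_fine",
    "distance_meters": 0.3,
    "mask_erosion": 20,
    "labels": ["sigma"],
    "mask_bb": {"zmin": 0.23, "zmax": 0.1},
    "empty_measurement": "01_empty"
}"""

metadata = json.loads(metadata_text)
# The distance in meters corresponds to the capture directory name in cm:
distance_cm = round(metadata["distance_meters"] * 100)
print(metadata["labels"], distance_cm)
```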

The alignment.json file contains intermediate data about the calibration procedure as well as the calibration results, which are transformation matrices between sensor spaces:

{
    # 4x4 matrix in row-first order that transforms from kinect space to photogrammetry space
    "kinect2photogrammetry": [[0,0,0,0], [0,0,0,0], [0,0,0,0], [0,0,0,1]],
    # transforms from kinect -> radar
    "kinect2radar": [[0,0,0,0], [0,0,0,0], [0,0,0,0], [0,0,0,1]],
    # transforms from the original sensor space to an intermediate, so-called 'world space', which is the same for all sensors and has the following coordinates:
    # Y
    # ^
    # |   z
    # |  /
    # | /
    # \------> X
    "kinect2world": [[0,0,0,0], [0,0,0,0], [0,0,0,0], [0,0,0,1]],
    # some intermediate data that was used for calibration
    "kinect_calib" : {},
    
    ...

    "photogrammetry2kinect": ...
    "photogrammetry2realsense": ...
    "photogrammetry2zed" : ...
    "photogrammetry2radar" : ...
    "photogrammetry2world": ...
    "photo_calib": ...

    ... analogous for all other sensors
}
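With 4x4 row-major matrices, chaining and applying these transforms is standard homogeneous-coordinate math. The sketch below uses identity matrices as stand-ins for the real calibration values read from alignment.json:

```python
import numpy as np

# Placeholder calibration: real values come from alignment.json.
kinect2photogrammetry = np.eye(4)
photogrammetry2radar = np.eye(4)

# Chaining transforms: kinect space -> photogrammetry space -> radar space.
kinect2radar = photogrammetry2radar @ kinect2photogrammetry

def transform_points(T, points):
    """Apply a 4x4 homogeneous transform to an (N, 3) array of points."""
    homogeneous = np.hstack([points, np.ones((points.shape[0], 1))])
    return (homogeneous @ T.T)[:, :3]

pts_kinect = np.array([[0.1, 0.2, 0.3]])
pts_radar = transform_points(kinect2radar, pts_kinect)
print(pts_radar)  # identical to the input here, since the matrices are identity
```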

Radar

The radar directory structure looks like this:

|--> radar_72.0_82.0_128
    # stores metadata about the reconstruction process
    |--> calibration.json
    # contains, for each frame, the raw radar data, which is a tensor of 94x94x128 complex numbers
    # the additional 'emptyfiltered' versions contain data that was pre-calibrated to filter
    # out systematic noise patterns that are present due to the furniture in the
    # measurement room
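The 'emptyfiltered' idea — removing the static room response — amounts to a complex background subtraction. The sketch below demonstrates it on synthetic data shaped like the 94x94x128 tensors described above; the on-disk file format and the repository's actual calibration procedure are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for one raw frame and the matching 'empty' measurement.
shape = (94, 94, 128)
empty = rng.standard_normal(shape) + 1j * rng.standard_normal(shape)  # static room response
target = np.zeros(shape, dtype=complex)
target[40:50, 40:50, 60] = 5.0  # a reflector somewhere in the scene
raw = empty + target  # what the sensor records with the object present

# Coherent background subtraction: the static response cancels,
# leaving (ideally) only the object's contribution.
emptyfiltered = raw - empty
print(emptyfiltered.shape, np.abs(emptyfiltered).max())
```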
