
English · العربية · Español · Français · 日本語 · 한국어 · Tiếng Việt · 中文 (简体) · 中文(繁體) · Deutsch · Русский

Self-Calibrated Neuromorphic Hyperspectral Imaging (OpenHI)

License: MIT

[!NOTE] i18n status in this checkout: ar, es, fr, ja, ko are present under i18n/. Additional language links are kept for compatibility with planned translation coverage.

A comprehensive pipeline for reconstructing spectra from event cameras with dispersed-light illumination (e.g., a diffraction grating). The system records intensity-change events $e = (x, y, t, p)$, where $p \in \{-1, +1\}$ indicates the polarity of the log-intensity change, and automatically infers scan timing and calibration metadata ("auto info") directly from the event stream.
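As a hedged sketch of that event model (the structured dtype and field names below are illustrative, not the repository's on-disk format), a stream of $(x, y, t, p)$ tuples can be held in a NumPy structured array and its signed polarities accumulated per pixel:

```python
import numpy as np

# Illustrative event record: pixel coordinates, timestamp (µs), polarity ±1
event_dtype = np.dtype([("x", "<u2"), ("y", "<u2"), ("t", "<i8"), ("p", "<i1")])

events = np.array(
    [(10, 4, 100, 1), (10, 4, 250, 1), (11, 4, 300, -1)],
    dtype=event_dtype,
)

# Net polarity per pixel approximates the accumulated log-intensity change
frame = np.zeros((8, 16), dtype=np.int32)
np.add.at(frame, (events["y"], events["x"]), events["p"])
```

`np.add.at` is used instead of `frame[ys, xs] += ps` because it handles repeated pixel coordinates, which plain fancy-indexed assignment would silently collapse.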

At a Glance

| Item | Details |
|---|---|
| Core idea | Self-calibrated hyperspectral derivative imaging from event streams |
| Main stages | `segment_robust_fixed.py` -> `compensate_multiwindow_train_saved_params.py` -> visualization scripts |
| Hardware docs in repo | `3D/`, `PCB/`, `firmware/`, `BOM/` |
| Desktop tools | `scan_compensation_gui_cloud.py`, `ImagingGUI/DualCamera_separate_transform.py` |
| Canonical paper | Optica Open preprint (DOI: 10.1364/opticaopen.30739151) |
| i18n in this checkout | `README.ar.md`, `README.es.md`, `README.fr.md`, `README.ja.md`, `README.ko.md` |

<p align="center"> <img src="images/device_setup.png" alt="Device setup" width="24%"> <img src="images/data_acquisition_gui.png" alt="Acquisition GUI" width="74%"> </p>

Left: modular transmission microscope with a motorised grating illumination arm and vertical detection stack. Right: data-acquisition GUI used to monitor segmentation, compensation, and reconstructions in real time.

[!TIP] Purchase the core development kit (excluding camera, tube lens, and optical table) used in the paper *Self-calibrated neuromorphic hyperspectral imaging*, preprinted on Optica Open:

  • https://lazying.art/openhi-kit.html
  • Promotion code for 30% off: OPTICA


Overview

When illumination sweeps across wavelengths over time, the event stream encodes a temporal derivative of the underlying spectrum along the dispersion axis.

```
RAW event recording
   -> scan timing segmentation (F/B passes)
   -> multi-window time-warp compensation
   -> frame/cumulative/wavelength diagnostics
```
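The derivative framing explains why cumulative sums recur in the later stages: integrating the signed event signal over the scan recovers the relative log-spectrum. A minimal synthetic sketch (not repository code):

```python
import numpy as np

# Synthetic scan: a smooth spectrum sampled as the sweep progresses
t = np.linspace(0.0, 1.0, 200)
log_spectrum = np.exp(-((t - 0.5) ** 2) / 0.02)  # ground truth (log domain)

# Events fire on log-intensity changes; model them as the discrete derivative
derivative = np.diff(log_spectrum)

# Integrating the event signal recovers the spectrum up to its starting value
reconstructed = log_spectrum[0] + np.concatenate([[0.0], np.cumsum(derivative)])
assert np.allclose(reconstructed, log_spectrum)
```

In practice the starting value (the constant of integration) is unknown, which is one reason the pipeline works with relative, self-calibrated quantities.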

This pipeline provides three main stages:

| Stage | Purpose | Primary script(s) |
|---|---|---|
| 1. Segment | Find scan timing and split recordings into forward/backward passes | `segment_robust_fixed.py` |
| 2. Compensate | Estimate a piecewise-linear time-warp to remove scan-induced temporal tilt | `compensate_multiwindow_train_saved_params.py` |
| 3. Visualize | Overlay learned boundaries and compare original vs. compensated time-binned frames | `visualize_boundaries_and_frames.py`, `visualize_cumulative_compare.py` |
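The compensation in stage 2 can be pictured as remapping event timestamps through a learned piecewise-linear warp. The knot values below are made up for illustration; the actual parameterization and training live in `compensate_multiwindow_train_saved_params.py`:

```python
import numpy as np

# Hypothetical learned warp: knots in recorded time -> corrected time
knots_in = np.array([0.0, 0.3, 0.7, 1.0])     # window boundaries (recorded)
knots_out = np.array([0.0, 0.25, 0.75, 1.0])  # compensated positions

def warp(t):
    """Piecewise-linear remap of event timestamps between window knots."""
    return np.interp(t, knots_in, knots_out)

t_events = np.array([0.15, 0.5, 0.9])
t_corrected = warp(t_events)
```

`np.interp` interpolates linearly between the knots, so each window gets its own local stretch or squeeze of the time axis.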

The repository also includes hardware assets, acquisition GUI code, and archival experiment branches under versions/.

Features

  • End-to-end RAW-to-spectrum event processing workflow.
  • Auto/manual scan period detection and forward/backward segmentation.
  • Multi-window compensation with trainable/fixed parameter modes.
  • Parameter save/load in NPZ, JSON, and CSV.
  • Multi-scan merge workflow for faster training iterations (compensate_multiwindow_turbo.py).
  • Visualization suite for boundaries, binned frames, cumulative curves, and weighted diagnostics.
  • Hardware documentation: BOM, PCB, 3D parts, firmware notes.
  • Acquisition utilities for synchronized event/frame camera setups.

| Category | Included capabilities |
|---|---|
| Signal processing | Segmentation, period detection, time-warp compensation |
| Optimization | Trainable/fixed parameters, smoothness controls, chunked training |
| Outputs | Visual overlays, cumulative comparisons, wavelength-mapped diagnostics |
| Platform assets | Hardware design files, firmware notes, GUI tooling, historical archives |
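One common way to implement the automatic period detection listed above is to autocorrelate the binned event rate and take the first strong non-zero-lag peak; a sketch on a synthetic rate signal (not the repository's detector):

```python
import numpy as np

# Synthetic event-rate signal with an exact 50-bin scan period
period = 50
rate = np.tile(np.sin(np.linspace(0.0, np.pi, period)) ** 2, 8)

# Autocorrelation of the mean-removed rate peaks at multiples of the period
centered = rate - rate.mean()
acf = np.correlate(centered, centered, mode="full")[len(centered) - 1 :]

# Search a window that excludes the trivial zero-lag peak
lag = int(np.argmax(acf[period // 2 : 2 * period])) + period // 2
```

Real recordings need extra care (noise, drift, unequal forward/backward durations), which is why the pipeline also exposes a manual override.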

Repository Map

Key hardware assets are kept alongside the code for quick access:

| Area | Path |
|---|---|
| 3D-printed parts | `3D/` |
| PCB layouts | `PCB/` |
| Microcontroller firmware | `firmware/` |
| Acquisition UI (desktop) | `ImagingGUI/` |
| Experiment/data references | `comparisons/reference_spectrum_2835/`, `comparisons/reference_spectrum_lumileds/`, `references/` |
| Alignment analysis | `comparisons/align_background_vs_reference_code/`, `comparisons/alignment_configs/` |

Project Structure

```
OpenHI/
├── README.md
├── QUICKSTART.md
├── LICENSE
├── versions.md
├── 3D/
├── BOM/
├── PCB/
├── firmware/
├── ImagingGUI/
├── scripts/
├── segment_robust_fixed.py
├── compensate_multiwindow_train_saved_params.py
├── compensate_multiwindow_turbo.py
├── compensate_multiwindow*.py
├── visualize_boundaries_and_frames.py
├── visualize_cumulative_compare.py
├── visualize_cumulative_weighted.py
├── scan_compensation_gui_cloud.py
├── show_envi_spectrum_gui.py
├── simple_raw_reader.py
├── comparisons/align_background_vs_reference_code/
├── align_data_vs_filter_code/
├── comparisons/alignment_configs/
├── versions/05_archive_code_variants/
├── comparisons/outputs_root/
├── comparisons/reference_filters/
├── comparisons/reference_spectrum_2835/
├── comparisons/reference_spectrum_lumileds/
├── references/
├── i18n/
└── versions/
```

Quick Start (5-Min Path)

If your environment is already prepared and your dataset folder contains an `*event*.raw` file:

```bash
scripts/run_scan_pipeline.sh /path/to/dataset_dir
```

To force a specific RAW file:

```bash
scripts/run_scan_pipeline.sh /path/to/dataset_dir /path/to/recording_event.raw
```

This wrapper runs segmentation, compensation training, and visualization using repository-default script paths and CLI flags.

[!TIP] For a first validation, run the wrapper on one dataset directory, then inspect the generated segment NPZ and visualization outputs before tuning the PIPELINE_* variables.

Prerequisites

  • Python 3.9+ (Python 3.10+ for some GUI tooling under ImagingGUI/).
  • Core Python packages: numpy, torch, matplotlib.
  • Optional but common: opencv-python, pillow, cellpose.
  • Metavision SDK / Python bindings for RAW event reading workflows (simple_raw_reader.py, segmentation from RAW).
  • CUDA-enabled PyTorch is recommended for faster optimization.
  • RAW recordings and/or segmented NPZ files available locally.

Installation

No locked environment file is currently provided at the repository root. Suggested setup:

```bash
# create and activate a virtual environment or conda env
python -m venv .venv
source .venv/bin/activate

# install core dependencies
pip install numpy matplotlib torch

# optional tools often used in this repository
pip install opencv-python pillow
# pip install cellpose
```

If using Git hooks for large-file hygiene:

```bash
bash scripts/setup_hooks.sh
```

Usage

Basic Workflow (current root scripts)

```bash
# 1. Segment RAW into 6 scans (Forward/Backward)
python segment_robust_fixed.py \
  data/recording.raw \
  --segment_events \
  --output_dir data/segments/

# 2. Train multi-window compensation
python compensate_multiwindow_train_saved_params.py \
  data/segments/Scan_1_Forward_events.npz \
  --bin_width 50000 \
  --visualize --plot_params
```
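After step 1, the segment NPZ can be sanity-checked with NumPy before training. The array keys below (`x`, `y`, `t`, `p`) are assumptions for illustration; list the real ones with `seg.files` first:

```python
import io

import numpy as np

# Stand-in for data/segments/Scan_1_Forward_events.npz so the snippet runs anywhere
buf = io.BytesIO()
np.savez(
    buf,
    x=np.array([3, 7]), y=np.array([1, 2]),
    t=np.array([120_000, 480_000]),  # timestamps in µs
    p=np.array([1, -1]),
)
buf.seek(0)

seg = np.load(buf)
print(sorted(seg.files))                       # arrays stored in the segment
duration_us = int(seg["t"].max() - seg["t"].min())
n_bins = duration_us // 50_000 + 1             # bin count at --bin_width 50000
```

A quick look at the time span and implied bin count helps confirm that `--bin_width` is sensible before committing to a training run.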