HYPERDOA
Hyperdimensional Computing for Direction-of-Arrival/Angle-of-Arrival Estimation
Official implementation of the paper: "HYPERDOA: Robust and Efficient DoA Estimation using Hyperdimensional Computing" (arXiv:2510.10718) - To appear at ICASSP 2026, Barcelona, Spain!
News
- [2025/10] HYPERDOA preprint is released.
- [2026/01] HYPERDOA is accepted at ICASSP 2026.
Overview
HYPERDOA is a lightweight, efficient implementation of hyperdimensional-computing (HDC) based direction-of-arrival (DoA) estimation for uniform linear arrays (ULAs).
Installation
Using uv (Recommended)
# Clone the repository
git clone https://github.com/pwh9882/HYPERDOA.git
cd HYPERDOA
# Create virtual environment and install
uv venv
uv pip install -e .
# Or install with dev dependencies
uv pip install -e ".[dev]"
Using pip
git clone https://github.com/pwh9882/HYPERDOA.git
cd HYPERDOA
pip install -e .
Requirements
- Python >= 3.10
- PyTorch >= 2.0
- torch-hd >= 0.6.0
- numpy >= 1.24
- matplotlib >= 3.7
Quick Start
from hyperdoa import HDCAoAModel, DOAConfig, evaluate_hdc
import torch
# Configure system
config = DOAConfig(N=8, M=3, T=100, snr=-5)
# Load your dataset (list of (X, Y) tuples)
# X: Complex tensor (N, T) - sensor observations
# Y: Tensor (M,) - ground truth DOA in radians
train_data = torch.load("data/train_dataset.pt")
test_data = torch.load("data/test_dataset.pt")
# Train and evaluate
loss, model = evaluate_hdc(
    train_data, test_data, config,
    feature_type="lag",  # or "spatial_smoothing"
    return_model=True,
)
print(f"Test MSPE: {loss:.2f} dB")
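The reported loss is a mean squared periodic error (MSPE) on a dB scale. As a hedged illustration of what such a metric typically computes — the package's exact definition (e.g. how it pairs predicted and true angles) may differ — here is a minimal sketch that aligns the two angle sets by sorting and wraps errors into (-π, π]:

```python
import torch

def mspe_db(pred: torch.Tensor, true: torch.Tensor) -> float:
    """Mean squared periodic error between angle sets (radians), in dB.

    Sorting both sets is a simple way to align predictions with ground
    truth; the actual package may use a full permutation search instead.
    """
    err = torch.sort(pred).values - torch.sort(true).values
    # Wrap differences into (-pi, pi] so the error respects angle periodicity
    err = torch.remainder(err + torch.pi, 2 * torch.pi) - torch.pi
    return float(10 * torch.log10((err ** 2).mean()))
```

For example, predictions `[0.1, 0.0]` against ground truth `[0.0, 0.2]` give an MSPE of roughly -23 dB under this definition.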
Dataset Generation
HYPERDOA uses datasets generated by SubspaceNet; the dataset format is directly compatible, so no conversion is needed.
Dataset Format
Each dataset is a list of (X, Y) tuples:
| Field | Type | Shape | Description |
| ----- | ----------------- | -------- | ------------------------------- |
| X | torch.complex64 | (N, T) | Sensor observations |
| Y | torch.float64 | (M,) | Ground truth DOA in radians |
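To sanity-check this format without running SubspaceNet, a compatible sample can be synthesized by hand. The sketch below uses the standard narrowband ULA signal model with half-wavelength sensor spacing; it is only an illustration of the `(X, Y)` layout, not the generator used in the paper:

```python
import torch

N, M, T = 8, 3, 100  # sensors, sources, snapshots
doa = torch.tensor([-0.5, 0.1, 0.7], dtype=torch.float64)  # ground-truth DoA (radians)

# Narrowband ULA steering matrix, half-wavelength spacing:
# a_n(theta) = exp(-j * pi * n * sin(theta))
n = torch.arange(N, dtype=torch.float64).unsqueeze(1)  # (N, 1)
A = torch.exp(-1j * torch.pi * n * torch.sin(doa))     # (N, M), complex128

# Unit-power random source signals plus white noise
S = (torch.randn(M, T) + 1j * torch.randn(M, T)) / 2**0.5
noise = (torch.randn(N, T) + 1j * torch.randn(N, T)) / 2**0.5
X = (A @ S.to(A.dtype) + noise.to(A.dtype)).to(torch.complex64)  # (N, T)

sample = (X, doa)    # one (X, Y) tuple in the expected format
dataset = [sample]   # a dataset is a list of such tuples
```

The resulting `dataset` can be saved with `torch.save(dataset, "toy_dataset.pt")` and loaded exactly like the SubspaceNet-generated files.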
Generate Datasets Using Our Script
We provide a script that wraps SubspaceNet's data generation:
# Step 1: Clone SubspaceNet
git clone https://github.com/ShlezingerLab/SubspaceNet.git
# Step 2: Generate datasets with paper settings
uv run python scripts/generate_data.py \
    --subspacenet-path ./SubspaceNet \
    --output data/ \
    --N 8 --M 3 --T 100 \
    --snr -5 \
    --train-samples 45000 \
    --test-samples 2250
Example Configuration
The following parameters can be used as a starting point:
| Parameter | Value | Description |
| --------------- | ------------------------ | -------------------------- |
| N | 8 | Number of sensors |
| M | 3 or 4 | Number of sources |
| T | 100 | Number of snapshots |
| SNR | -5 to -1 dB or 1 to 5 dB | Signal-to-noise ratio (dB) |
| signal_nature | non-coherent / coherent | Signal correlation type |
| train_samples | 45,000 | Training set size |
| test_samples | 2,250 | Test set size |
Alternative: Use SubspaceNet Directly
You can also use SubspaceNet's own data generation:
# In SubspaceNet directory
import sys
sys.path.insert(0, '.')
from src.system_model import SystemModelParams
from src.data_handler import create_dataset
params = SystemModelParams()
params.N, params.M, params.T = 8, 3, 100
params.snr = -5
params.signal_nature = "non-coherent"
# Generate dataset
model_dataset, generic_dataset, samples_model = create_dataset(
    system_model_params=params,
    samples_size=45000,
    model_type="RawX",  # This gives us raw observations
    save_datasets=False,
)
# generic_dataset is directly compatible with HYPERDOA!
# Save it:
import torch
torch.save(generic_dataset, "train_dataset.pt")
Feature Types
| Feature Type | Description |
| ------------------- | -------------------------------- |
| lag | Mean spatial-lag autocorrelation |
| spatial_smoothing | Spatial smoothing covariance |
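As a rough sketch of what a spatial-lag feature can look like — this is an illustrative re-implementation of a mean spatial-lag autocorrelation, not the package's actual code — one can average the sample covariance along its diagonals:

```python
import torch

def mean_lag_autocorrelation(X: torch.Tensor) -> torch.Tensor:
    """Average autocorrelation of the array output at each spatial lag.

    X: complex (N, T) sensor observations.
    Returns a complex (N,) vector whose entry ell averages
    x_{n+ell}(t) * conj(x_n(t)) over time and over all sensor
    pairs separated by ell elements.
    """
    N, T = X.shape
    R = (X @ X.conj().T) / T  # sample covariance, (N, N)
    lags = torch.empty(N, dtype=R.dtype)
    for ell in range(N):
        # The -ell diagonal of R holds all pairs at spatial lag ell
        lags[ell] = torch.diagonal(R, offset=-ell).mean()
    return lags
```

A feature vector like this is what would then be encoded into a hypervector; the `spatial_smoothing` variant instead averages covariances of overlapping subarrays, which is the standard trick for handling coherent sources.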
Examples
# Generate data first
uv run python scripts/generate_data.py --subspacenet-path ./SubspaceNet
# Basic training
uv run python examples/train_basic.py --data-dir data/
API Reference
DOAConfig
Configuration dataclass for system parameters.
DOAConfig(
    N=8,                          # Number of sensors
    M=3,                          # Number of sources (3 or 4 in paper)
    T=100,                        # Number of snapshots
    snr=-5.0,                     # SNR in dB (-5 to -1 or 1 to 5 in paper)
    signal_nature="non-coherent", # or "coherent"
)
HDCAoAModel
Main HDC model for DOA estimation.
model = HDCAoAModel(
    N=8, M=3, T=100,
    feature_type="lag",
    n_dimensions=10000,      # Hypervector dimensionality
    min_angle=-90.0,         # Minimum angle (degrees)
    max_angle=90.0,          # Maximum angle (degrees)
    precision=0.1,           # Angle resolution (degrees)
    min_separation_deg=6.0,  # Minimum peak separation (degrees)
)
# Methods
model.train_from_dataloader(loader) # Train from DataLoader
model.fit(X, y) # Train from tensors
model.predict(X) # Predict angles (radians)
model.predict_multi(X, k=2) # Multi-source prediction
model.predict_logits(X) # Raw classification scores
model.compute_mspe_db(loader) # Compute MSPE (dB)
evaluate_hdc
Convenience function for training and evaluation.
loss, model = evaluate_hdc(
    train_data, test_data, config,
    feature_type="lag",
    return_model=True,
    verbose=True,
)
Citation
If you find HYPERDOA useful in your research, please cite:
@misc{hyperdoa_icassp,
  title={HYPERDOA: Robust and Efficient DoA Estimation using Hyperdimensional Computing},
  author={Rajat Bhattacharjya and Woohyeok Park and Arnab Sarkar and Hyunwoo Oh and Mohsen Imani and Nikil Dutt},
  year={2025},
  eprint={2510.10718},
  archivePrefix={arXiv},
  primaryClass={eess.SP},
  url={https://arxiv.org/abs/2510.10718},
}
Acknowledgements
- Data generation utilities based on SubspaceNet
- HDC encoding using torchhd
License
HYPERDOA is licensed under the MIT License. For questions or problems, feel free to open an issue. Thank you for using HYPERDOA!
