
<div align="center"> <img align="center" src="https://github.com/mitsuba-renderer/mitsuba2/raw/master/docs/images/logo_plain.png" width="90" height="90"/> </div> <!-- PROJECT LOGO --> <p align="center"> <h1 align="center">mitransient</h1> <h3 align="center">Transient light transport in Mitsuba 3 <br><br> <a href='https://mitransient.readthedocs.io/en/latest/?badge=latest'> <img src='https://readthedocs.org/projects/mitransient/badge/?version=latest' alt='Documentation Status' /></a> <a href='https://pypi.org/project/mitransient/'> <img src='https://img.shields.io/pypi/v/mitransient.svg?color=green' alt='PyPI version' /> </a> <a href='https://arxiv.org/abs/2510.25660'> <img src='https://img.shields.io/badge/arXiv-2510.25660-b31b1b.svg' alt='arXiv Paper' /> </a></h3> </p> <div align="center"> <img src="https://raw.githubusercontent.com/diegoroyo/mitransient/main/.images/cornell-box.png" width="200" height="200"/> <img src="https://raw.githubusercontent.com/diegoroyo/mitransient/main/.images/cornell-box.gif" width="200" height="200"/> <img src="https://raw.githubusercontent.com/diegoroyo/mitransient/main/.images/nlos-Z.png" width="200" height="200"/> <img src="https://raw.githubusercontent.com/diegoroyo/mitransient/main/.images/nlos-Z.gif" width="200" height="200"/> <br> <img src="https://raw.githubusercontent.com/diegoroyo/mitransient/main/.images/polarization.gif" width="320" height="240"/> <img src="https://raw.githubusercontent.com/diegoroyo/mitransient/main/.images/staircase_steady.png" width="160" height="240"/> <img src="https://raw.githubusercontent.com/diegoroyo/mitransient/main/.images/staircase_transient.gif" width="160" height="240"/> <img src="https://raw.githubusercontent.com/diegoroyo/mitransient/main/.images/staircase_diff.gif" width="160" height="240"/> </div> <br />

Overview

mitransient is a library that adds transient simulation support to Mitsuba 3, including non-line-of-sight (NLOS) data capture simulations, polarization tracking, and differentiable transient rendering.

<br>

[!TIP] Check out our <a href="https://mitransient.readthedocs.io">online documentation (mitransient.readthedocs.io)</a> and our code examples. <br>

Main features

  • Foundation ready to use: an easy interface to port your algorithms to the transient domain.
  • Python-only library for transient rendering on both CPU and GPU.
  • Several integrators already implemented: transient path tracing (also adapted for NLOS scenes) and transient volumetric path tracing.
  • Cross-platform: Mitsuba 3 has been tested on Linux (x86_64), macOS (aarch64, x86_64), and Windows (x86_64).
  • Polarization tracking.
  • Differentiable transient rendering.
<br>

License and citation

This project was started by Diego Royo, Miguel Crespo and Jorge Garcia-Pueyo. See below for the full list of mitransient contributors. Also see the original Mitsuba 3 license and contributors.

If you use our code in your project, please consider citing us using the following:

@misc{royo2025mitransient,
      title={mitransient: Transient light transport in Mitsuba 3}, 
      author={Diego Royo and Jorge Garcia-Pueyo and Miguel Crespo and Guillermo Enguita and Óscar Pueyo-Ciutad and Diego Bielsa},
      year={2025},
      eprint={2510.25660},
      archivePrefix={arXiv},
      primaryClass={cs.GR},
      url={https://arxiv.org/abs/2510.25660}, 
}

Additionally, the NLOS features are based on our publication Non-line-of-sight transient rendering. Please also consider citing us if you use them:

@article{royo2022non,
	title        = {Non-line-of-sight transient rendering},
	author       = {Diego Royo and Jorge García and Adolfo Muñoz and Adrian Jarabo},
	year         = 2022,
	journal      = {Computers & Graphics},
	doi          = {https://doi.org/10.1016/j.cag.2022.07.003},
	issn         = {0097-8493},
	url          = {https://www.sciencedirect.com/science/article/pii/S0097849322001200}
}

What is transient rendering?

Conventional rendering is referred to as steady state, where the light propagation speed is assumed to be infinite. In contrast, transient rendering breaks this assumption allowing us to simulate light in motion (see the teaser image for a visual example).

For example, path tracing algorithms integrate over multiple paths that connect a light source with the camera. For a known path, transient path tracing uses the very complex formula time = distance / speed (see [Two New Sciences by Galileo]) to compute when each photon arrives at the camera from the path's length and the speed of light. This adds a new time dimension to the captured images (i.e. it's a video now). The simulations take new parameters as input: when to start recording the video, how long each time step is (framerate), and how many frames to record.
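The binning described above can be sketched in a few lines of plain Python (a hypothetical illustration with made-up parameter names, not mitransient's actual API):

```python
# Hypothetical sketch of transient binning -- not mitransient's actual API.
# A path's arrival time is mapped to a frame index of the transient video.
C = 299_792_458.0  # speed of light in m/s

def frame_index(path_distance_m, start_time_s, bin_width_s):
    """Return the video frame that a photon travelling `path_distance_m`
    metres falls into, given the recording start time and the duration
    of each frame (time bin)."""
    arrival_time = path_distance_m / C  # time = distance / speed
    return int((arrival_time - start_time_s) / bin_width_s)

# A 3 m path, recorded from t = 0 with 1 ns frames, lands in frame 10:
# 3 / c ≈ 1.0007e-8 s  ->  bin 10
print(frame_index(3.0, 0.0, 1e-9))
```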

Note: since the time values we need to compute are very small (e.g. light takes only ~3.33 * 10^-9 seconds to travel 1 meter), time is usually measured in optical path length instead. See Wikipedia for more information. TL;DR opl = distance * refractive_index
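The TL;DR above really is a one-liner; a minimal sketch (plain Python, not mitransient code):

```python
def optical_path_length(distance_m, refractive_index=1.0):
    """Optical path length (OPL): geometric distance scaled by the
    refractive index of the medium the light travels through."""
    return distance_m * refractive_index

# 1 m of water (n ~ 1.33) contributes 1.33 m of optical path length:
print(optical_path_length(1.0, 1.33))  # 1.33
```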

Installation

We provide the package via PyPI. To install mitransient you need to run:

pip install mitransient

which will also install the mitsuba Python package as a dependency.

[!IMPORTANT] mitransient and mitsuba come in different variants that specify the number of channels (RGB, monochromatic, etc.) and the hardware used for execution (CPU, GPU, etc.). If you install mitransient/mitsuba via pip, you will have access to the variants specified in this website. More variants are available, but you will have to compile Mitsuba 3 yourself.

[!TIP] If you wish to use your own compiled Mitsuba 3, see the section below "If you use your own Mitsuba 3".

Requirements

  • Python >= 3.8
  • mitsuba >= 3.6.0
  • (optional) For computation on the GPU: Nvidia driver >= 495.89
  • (optional) For vectorized / parallel computation on the CPU: LLVM >= 11.1

After installation

At this point, you should be able to import mitsuba and import mitransient in your Python code (be careful to set the correct PATH environment variable if you have compiled Mitsuba 3 yourself; see the section below).
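A quick sanity check (plain Python, no mitransient-specific APIs assumed) can confirm that both packages are visible on your current import path:

```python
# Check that mitsuba and mitransient can be found on the import path.
import importlib.util

def check_packages(names=("mitsuba", "mitransient")):
    """Map each package name to whether Python can locate it."""
    return {name: importlib.util.find_spec(name) is not None for name in names}

for pkg, found in check_packages().items():
    print(f"{pkg}: {'OK' if found else 'NOT found -- check PATH/PYTHONPATH'}")
```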

For NLOS data capture simulations, see https://github.com/diegoroyo/tal. tal is a toolkit that allows you to create and simulate NLOS scenes with an easier shell interface instead of directly from Python.

If you use your own Mitsuba 3

First, you will want to install mitransient without the mitsuba dependency:

pip install mitransient --no-deps

Then you will need to import mitsuba in your Python scripts. Concretely, the PYTHONPATH variable should point to the Mitsuba module that is built upon compilation. There are different ways to do so:

  • One solution is to source the setpath.sh script generated when compiling the Mitsuba 3 repo (More info). This script modifies the PATH and PYTHONPATH variables so that this version of Mitsuba is loaded first.
  • Alternatively, set the PYTHONPATH environment variable yourself.
  • To make a custom build globally available, use pip install --editable .. This creates a symlinked copy of the package files inside the corresponding site-packages folder, so the package is listed by pip and available like any other installed package. If you recompile Mitsuba, the newest version is picked up directly. Please follow these instructions:
    • Go to <mitsuba-path>/mitsuba3/build/python/drjit and execute pip install --editable ..
    • Go to <mitsuba-path>/mitsuba3/build/python/mitsuba and execute pip install --editable ..
  • If you are a user of Jupyter Notebooks, the easiest solution is to add the following snippet of code to modify the notebook's PYTHONPATH:
import sys
sys.path.insert(0, '<mitsuba-path>/mitsuba3/build/python')
import mitsuba as mi

Usage

[!TIP] Check out the examples folder for practical usage! <br>

You are now prepared to render your first transient scene with mitransient. Running the code below will render the famous Cornell Box scene in the transient domain and show a video.

import mitsuba as mi
mi.set_variant('llvm_ad_rgb')
import mitransient as mitr

scene = mi.load_dict(mitr.cornell_box())
img_steady, img_transient = mi.render(scene, spp=1024)

img_transient = mitr.vis.tonemap_transient(img_transient)
mitr.vis.show_video(img_transient)

Plugins implemented

[!TIP] You can also look at the plugins' documentation in our online documentation. Look on the left menu for Integrators, Films, Emitters and Sensors.

mitransient implements a number of plugins which can be used in your scene descriptions; see the plugin documentation linked above for the full list.
