# Viv

A WebGL-powered toolkit for interactive visualization of high-resolution, multiplexed bioimaging datasets.
<p align="center"> <img src="https://github.com/hms-dbmi/viv/raw/main/sites/docs/src/3d-slicing.gif" alt="Interactive volumetric view in web browser; sliders control visible planes." width="400"/> <img src="https://github.com/hms-dbmi/viv/raw/main/sites/docs/src/glomerular-lens.png" alt="Multi-channel rendering of high-resolution microscopy dataset" width="400"/> </p>

## About
Viv is a JavaScript library for rendering OME-TIFF and OME-NGFF (Zarr) directly in the browser. The rendering components of Viv are packaged as deck.gl layers, making it easy to compose with existing layers to create rich interactive visualizations.
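As an illustrative sketch of this composition, the snippet below creates a Viv layer and passes it to a deck.gl `Deck` instance. The URL, view size, and channel settings are placeholders; the property names follow Viv's documented layer API, but consult the documentation for the version you have installed:

```javascript
import { Deck, OrthographicView } from '@deck.gl/core';
import {
  MultiscaleImageLayer,
  loadOmeTiff,
  getDefaultInitialViewState
} from '@hms-dbmi/viv';

// Load the image pyramid (an array of pixel sources, one per zoom level)
// from a remote OME-TIFF. The URL is a placeholder.
const { data } = await loadOmeTiff('https://example.com/image.ome.tif');

const layer = new MultiscaleImageLayer({
  loader: data,                       // pyramid of pixel sources
  selections: [{ z: 0, t: 0, c: 0 }], // which plane to show for each channel
  contrastLimits: [[0, 65535]],       // per-channel intensity window
  colors: [[255, 255, 255]],          // per-channel RGB tint
  channelsVisible: [true],
});

new Deck({
  views: [new OrthographicView({ id: 'ortho' })],
  initialViewState: getDefaultInitialViewState(data, { width: 800, height: 600 }),
  layers: [layer],                    // composes freely with other deck.gl layers
});
```

Because the layer is an ordinary deck.gl layer, the `layers` array above can also include scatterplots, polygons, or any other deck.gl layer to overlay annotations on the image.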
More details and related work can be found in our paper and original preprint. Please cite our paper in your research:
Trevor Manz, Ilan Gold, Nathan Heath Patterson, Chuck McCallum, Mark S Keller, Bruce W Herr II, Katy Börner, Jeffrey M Spraggins, Nils Gehlenborg, "Viv: multiscale visualization of high-resolution multiplexed bioimaging data on the web." Nature Methods (2022), doi:10.1038/s41592-022-01482-7
## 💻 Related Software
| Screenshot | Description |
|:-------------------------:|:-------------------------|
| <img src="https://github.com/hms-dbmi/viv/raw/main/sites/docs/src/avivator-browser.png" alt="Avivator viewer running in Chrome"/> | **Avivator** <br> A lightweight viewer for local and remote datasets. The source code is included in this repository under avivator/. See our 🎥 video tutorial to learn more. |
| <img src="https://github.com/hms-dbmi/viv/raw/main/sites/docs/src/vizarr-browser.png" alt="Vizarr viewer running in Jupyter Notebook"/> | **Vizarr** <br> A minimal, purely client-side program for viewing OME-NGFF and other Zarr-based images. Vizarr supports a Python backend using imjoy-rpc, allowing it to function not only as a standalone application but also to be embedded directly in Jupyter or Google Colab notebooks. |
## 💥 In Action
- Vitessce visualization framework
- HuBMAP Common Coordination Framework Exploration User Interface (CCF EUI)
- OME-Blog OME-NGFF and OME-NGFF HCS announcements
- ImJoy I2K Tutorial
- Galaxy Project includes Avivator as default viewer for OME-TIFF files
- 10x Genomics uses Viv in their viewer for Xenium In Situ Analysis Technology: demo
## 💾 Supported Data Formats
Viv's data loaders support OME-NGFF (Zarr), OME-TIFF, and Indexed OME-TIFF*.
We recommend converting proprietary file formats to these open standard formats via the
bioformats2raw + raw2ometiff pipeline. Non-pyramidal datasets are also supported,
provided the individual texture can be uploaded to the GPU (i.e., smaller than 4096 x 4096 pixels).
Please see the tutorial for more information.
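A minimal sketch of the conversion pipeline mentioned above, with placeholder file names (see each tool's README for its full set of options):

```shell
# Convert a proprietary format (e.g., .czi or .nd2) to an intermediate Zarr pyramid.
bioformats2raw input.czi intermediate.zarr

# Package the Zarr pyramid as a pyramidal OME-TIFF that Viv can render directly.
raw2ometiff intermediate.zarr output.ome.tiff
```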
*We describe Indexed OME-TIFF in our paper as an optional enhancement to provide efficient random chunk access for OME-TIFF. Our approach substantially improves chunk load times for OME-TIFF datasets with large Z, C, or T dimensions that otherwise may incur long latencies due to seeking. More information on generating an IFD index (JSON) can be found in our tutorial or documentation.
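As a sketch, an Indexed OME-TIFF can be loaded by passing the precomputed IFD offsets to `loadOmeTiff`. The URLs below are placeholders, and the offsets file is assumed to have been generated ahead of time as described in the tutorial:

```javascript
import { loadOmeTiff } from '@hms-dbmi/viv';

// Fetch the precomputed IFD index (JSON array of byte offsets) so the loader
// can seek directly to any chunk rather than walking the IFD chain.
const url = 'https://example.com/data/image.ome.tif';
const offsets = await fetch('https://example.com/data/image.offsets.json')
  .then((res) => res.json());

// Supplying the offsets enables efficient random chunk access for
// datasets with large Z, C, or T dimensions.
const { data, metadata } = await loadOmeTiff(url, { offsets });
```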
## 💽 Installation

```sh
$ npm install @hms-dbmi/viv
```
You will also need to install deck.gl and the other peerDependencies manually.
This step prevents users from ending up with multiple versions of deck.gl in their projects.

```sh
$ npm install deck.gl @luma.gl/core
```
Breaking changes may happen on minor version updates. Please see the changelog for details.
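Given that versioning policy, a project might pin Viv conservatively in its package.json (the version numbers below are purely illustrative):

```json
{
  "dependencies": {
    "@hms-dbmi/viv": "~4.1.0",
    "deck.gl": "^8.8.0",
    "@luma.gl/core": "^8.5.0"
  }
}
```

Here the `~` range allows patch releases of Viv but not minor updates, which is where breaking changes may land.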
## 📖 Documentation
Detailed API information and example snippets can be found in our documentation.
## 🏗️ Development
This repo is a monorepo using pnpm workspaces. The package manager used to install and link dependencies must be pnpm.
Each folder under packages/ is published as a separate package on npm under the @vivjs scope. The top-level package @hms-dbmi/viv re-exports from these dependencies.
To develop and test the @hms-dbmi/viv package:
- Run `pnpm install` in the viv root folder
- Run `pnpm dev` to start a development server
- Download the OME-Zarr files used as test fixtures - see ./packages/loaders/tests/fixtures/ome-zarr/README.md
- Run `pnpm test` to run all tests (or a specific package, e.g., `pnpm test --filter=@vivjs/layers`)
## 🛠️ Build
To build viv's documentation and the Avivator website (under sites/), run:
```sh
pnpm build                      # all packages, avivator, and documentation
pnpm --filter "avivator" build  # build a specific package or site
```
## 📄 Sending PRs and making releases
For changes to be reflected in package changelogs, run `npx changeset` and follow the prompts.
Note that not every PR requires a changeset. Since changesets are focused on releases and changelogs, changes to the repository that don't affect these (e.g., documentation, tests) don't need a changeset.
The Changesets GitHub Action will create and update a PR that applies changesets and publishes new versions of @vivjs/ packages to NPM.
Note: If the release includes a new NPM package (e.g., @vivjs/new-thing), OIDC trusted publishing must first be configured via NPM (see discussion).
## 🌎 Browser Support
Viv is supported in Safari, Firefox, Chrome, and Edge. Please file an issue if you find a browser in which Viv does not work.
