BIOMERO — BioImage Analysis in OMERO


🚀 This package is part of BIOMERO 2.0 — For complete deployment and FAIR infrastructure setup, start with the NL-BIOMERO Documentation 📖

The BIOMERO framework, for BioImage analysis in OMERO, allows you to run (FAIR) bioimage analysis workflows directly from OMERO on a high-performance compute (HPC) cluster, remotely through SSH.

BIOMERO 2.0

We have released an enhanced BIOMERO experience!

BIOMERO 2.0 is a complete ecosystem that includes:

  • BIOMERO.analyzer (this Python library) - The core analysis engine
  • BIOMERO.scripts - OMERO scripts for HPC integration
  • BIOMERO.importer - Automated data import service
  • OMERO.biomero - Modern web interface plugin

Full workflow tracking is now supported via a database and dashboard. The OMERO.biomero plugin provides an intuitive interface in OMERO.web. Every workflow run is uniquely identifiable, and resulting assets are accessible directly in OMERO.

NL-BIOMERO provides a full containerized deployment stack and documentation; see the Deploy with NL-BIOMERO section below.

📊 Highlights

| Feature | BIOMERO 1.x | BIOMERO 2.0 |
|---------|-------------|-------------|
| Workflow Tracking | Logs only | Full database events |
| Interface | Scripts only | Modern web plugin + scripts |
| Progress Monitoring | Manual | Live dashboards |
| Job History | None | Complete execution history |
| Analytics | None | Integrated Metabase |


BIOMERO Python library (BIOMERO.analyzer)

The BIOMERO framework consists of this Python library biomero (also known as BIOMERO.analyzer), together with the BIOMERO.scripts that can be run directly from the OMERO web interface.

The package includes the SlurmClient class, which provides SSH-based connectivity and interaction with a Slurm (high-performance compute) cluster. The package enables users to submit jobs, monitor job status, retrieve job output, and perform other Slurm-related tasks. Additionally, the package offers functionality for configuring and managing paths to Slurm data and Singularity images (think Docker containers...), as well as specific FAIR image analysis workflows and their associated repositories.
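Under the hood, this kind of Slurm interaction boils down to running commands on the login node over SSH. The sketch below illustrates that pattern only; the function name and host alias are made up for illustration and are not the `SlurmClient` API:

```python
# Illustrative sketch of SSH-based Slurm interaction (BIOMERO wraps this
# in its SlurmClient class; run_remote and "slurm-login" are hypothetical).
import subprocess

def run_remote(host: str, command: str) -> str:
    """Run a command on the Slurm login node over SSH and return stdout."""
    result = subprocess.run(
        ["ssh", host, command],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

# e.g. submit a job script and parse the job id from sbatch's reply:
# out = run_remote("slurm-login", "sbatch my_workflow.sh")
# job_id = out.strip().split()[-1]  # sbatch prints "Submitted batch job <id>"
```

The same channel (SSH + `sbatch`/`sacct`/`scp`) is what the library's job submission, monitoring, and data transfer features are built on.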

Overall, the biomero package simplifies the integration of HPC functionality within the OMERO platform for admins and provides an efficient and end-user-friendly interface towards both the HPC and FAIR workflows.

⚠️ Warning: The default settings are intended for short and medium jobs. For long workflows (>45 min), adjust the relevant time-limit settings in your configuration.


Overview

In the figure below we show our BIOMERO framework, for BioImage analysis in OMERO.

BIOMERO 1.0 consists of this Python library (biomero) and the integrations within OMERO, currently through our BIOMERO.scripts.

*(Figure 1: Overview of the BIOMERO framework)*

For the BIOMERO 2.0 setup, see NL-BIOMERO for deployment; and for details on the design and FAIR features, see our latest preprint: “BIOMERO 2.0: end-to-end FAIR infrastructure for bioimaging data import, analysis, and provenance”

Deploy with NL-BIOMERO

For the easiest deployment and integration with other FAIR infrastructure, use the NL-BIOMERO stack:

  • NL-BIOMERO deployment repo: https://github.com/NL-BioImaging/NL-BIOMERO
  • OMERO.biomero OMERO.web plugin: https://github.com/NL-BioImaging/OMERO.biomero
  • Prebuilt BIOMERO processor container: https://hub.docker.com/r/cellularimagingcf/biomero

Quickstart

For a quick overview of what this library can do for you, we can install an example setup locally with Docker:

  1. Setup a local OMERO w/ this library:
    • Follow Quickstart of https://github.com/NL-BioImaging/NL-BIOMERO
  2. Setup a local Slurm w/ SSH access:
    • Follow Quickstart of https://github.com/NL-BioImaging/NL-BIOMERO-Local-Slurm
  3. Upload some data with OMERO.insight to localhost server (... we are working on a web importer ... TBC)
  4. Try out some scripts from https://github.com/NL-BioImaging/biomero-scripts (already installed in step 1!):
    1. Run script biomero > admin > SLURM Init environment...
    2. Get a coffee or something. This will take at least 10 min to download all the workflow images.
    3. Select your image / dataset and run script biomero > workflows > SLURM Run Workflow...
      • Select at least one of the Select how to import your results options, e.g. change the Import into NEW Dataset text to hello world
      • Select a fun workflow, e.g. cellpose.
        • Set the nuc channel to the channel you want to segment (note that 0 is greyscale; use 1, 2, or 3 for RGB)
        • Uncheck use gpu (the local Slurm cluster from step 2 does not have GPU support built into its containers)
      • Refresh your OMERO Explore tab to see your hello world dataset with a mask image once the workflow is done!

Prerequisites & Getting Started with BIOMERO

Installation

Using pip (Cross-platform)

BIOMERO can be installed via pip with different dependency sets:

```shell
# Basic library only (core BIOMERO functionality)
python3 -m pip install biomero

# With BIOMERO.scripts requirements (includes BIOMERO.scripts dependencies)
python3 -m pip install 'biomero[full]'

# Latest development version with BIOMERO.scripts requirements
python3 -m pip install 'biomero[full] @ git+https://github.com/NL-BioImaging/biomero'

# With test dependencies (for local testing/development)
python3 -m pip install 'biomero[test]'

# With both test and BIOMERO.scripts requirements
python3 -m pip install 'biomero[test,full]'
```

Dependency sets explained:

  • Default (no extras): Core BIOMERO library for basic functionality
  • [test]: Adds pytest, coverage tools for local testing and development
  • [full]: Adds BIOMERO.scripts requirements (ezomero, tifffile, omero-metadata, omero-cli-zarr) which require complex dependencies like zeroc-ice and omero-py

Note: The [full] dependencies require complex packages like zeroc-ice and omero-py that need system libraries. For OMERO integration, you may need to install these separately or use conda.
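After installing, a quick way to see which of the heavier optional dependencies resolved is to probe for them with `importlib` (the names below are the two from the list above with straightforward top-level import names; the omero-* plugins are omitted):

```python
# Post-install sanity check: report which optional [full] dependencies
# are importable, without failing on the first missing one.
import importlib.util

optional_deps = ["ezomero", "tifffile"]
for name in optional_deps:
    found = importlib.util.find_spec(name) is not None
    print(f"{name}: {'available' if found else 'MISSING (try biomero[full])'}")
```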

Slurm Requirements

Note: This library has only been tested on Slurm versions 21.08.6 and 22.05.09!

Your Slurm cluster/login node needs to have:

  1. SSH access w/ public key (headless)
  2. SCP access (generally comes with SSH)
  3. 7zip installed
  4. Singularity/Apptainer installed
  5. (Optional) Git installed, if you want your own job scripts
  6. Slurm accounting enabled
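The checklist above can be verified quickly on the login node with a small shell loop (`sbatch` and `sacct` stand in for Slurm itself and for accounting; on Apptainer installs, substitute `apptainer` for `singularity`):

```shell
# Run on the Slurm login node: reports OK/MISSING for each required tool
# instead of stopping at the first absent one.
for tool in ssh scp 7z singularity sbatch sacct; do
    if command -v "$tool" >/dev/null 2>&1; then
        echo "$tool: OK"
    else
        echo "$tool: MISSING"
    fi
done
```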

OMERO Requirements

Your OMERO processing node needs to have:

  1. SSH client and access to the Slurm cluster (w/ private key / headless)
  2. SCP access to the Slurm cluster
  3. Python3.10+
  4. This library installed
    • With BIOMERO.scripts dependencies: python3 -m pip install biomero[full]
    • Latest development: python3 -m pip install 'biomero[full] @ git+https://github.com/NL-BioImaging/biomero'
  5. Configuration setup at /etc/slurm-config.ini
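That configuration file is standard INI, so it can be read with Python's `configparser`. The snippet below only illustrates the mechanism; the `[SSH]` section and `host` key are assumptions for the example, so consult the BIOMERO configuration docs for the real schema:

```python
# Illustrative: parsing an INI config like /etc/slurm-config.ini.
# Section/key names here are hypothetical examples, not BIOMERO's schema.
import configparser

config = configparser.ConfigParser()
config.read_string("""
[SSH]
host = localslurm
""")
print(config["SSH"]["host"])
```

In a real deployment you would call `config.read("/etc/slurm-config.ini")` instead of `read_string`.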

Your OMERO server node needs to have:

  1. Some OMERO example scripts installed to interact with this library:
    • My examples on github: https://github.com/NL-BioImaging/biomero-scripts
    • Install those at /opt/omero/server/OMERO.server/lib/scripts/slurm/, e.g. `git clone https://github.com/NL-BioImaging/biomero-scripts.git <path>`
