
<!-- markdownlint-disable MD033 MD041 --> <h1 align="center"> Opyrator </h1> <p align="center"> <strong>Turns your Python functions into microservices with web API, interactive GUI, and more.</strong> </p> <p align="center"> <a href="https://pypi.org/project/opyrator/" title="PyPi Version"><img src="https://img.shields.io/pypi/v/opyrator?color=green&style=flat"></a> <a href="https://pypi.org/project/opyrator/" title="Python Version"><img src="https://img.shields.io/badge/Python-3.6%2B-blue?style=flat"></a> <a href="https://github.com/ml-tooling/opyrator/blob/main/LICENSE" title="Project License"><img src="https://img.shields.io/badge/License-MIT-green.svg"></a> <a href="https://github.com/ml-tooling/opyrator/actions?query=workflow%3Abuild-pipeline" title="Build status"><img src="https://img.shields.io/github/workflow/status/ml-tooling/opyrator/build-pipeline?style=flat"></a> <a href="https://mltooling.substack.com/subscribe" title="Subscribe to newsletter"><img src="http://bit.ly/2Md9rxM"></a> <a href="https://twitter.com/mltooling" title="Follow on Twitter"><img src="https://img.shields.io/twitter/follow/mltooling.svg?style=social&label=Follow"></a> </p> <p align="center"> <a href="#getting-started">Getting Started</a> • <a href="#features">Features</a> • <a href="#examples">Examples</a> • <a href="#support--feedback">Support</a> • <a href="https://github.com/ml-tooling/opyrator/issues/new?labels=bug&template=01_bug-report.md">Report a Bug</a> • <a href="#contribution">Contribution</a> • <a href="https://github.com/ml-tooling/opyrator/releases">Changelog</a> </p>

Instantly turn your Python functions into production-ready microservices. Deploy and access your services via HTTP API or interactive UI. Seamlessly export your services into portable, shareable, and executable files or Docker images. Opyrator builds on open standards - OpenAPI, JSON Schema, and Python type hints - and is powered by FastAPI, Streamlit, and Pydantic. It removes the pain of productizing and sharing your Python code - or anything you can wrap into a single Python function.

<sup>Alpha Version: Only suggested for experimental usage.</sup>

<img style="width: 100%" src="https://raw.githubusercontent.com/ml-tooling/opyrator/main/docs/images/opyrator-header.png"/>
<p align="center"> Try out and explore various examples in our playground <a href="https://opyrator-playground.mltooling.org">here</a>. </p>

Highlights

  • 🪄  Turn functions into production-ready services within seconds.
  • 🔌  Auto-generated HTTP API based on FastAPI.
  • 🌅  Auto-generated Web UI based on Streamlit.
  • 📦  Save and share as self-contained executable file or Docker image.
  • 🧩  Reuse pre-defined components & combine with existing Opyrators.
  • 📈  Instantly deploy and scale for production usage.

Getting Started

Installation

Requirements: Python 3.6+.

pip install opyrator

Usage

  1. A simple Opyrator-compatible function could look like this:

    from pydantic import BaseModel
    
    class Input(BaseModel):
        message: str
    
    class Output(BaseModel):
        message: str
    
    def hello_world(input: Input) -> Output:
        """Returns the `message` of the input data."""
        return Output(message=input.message)
    

    💡 An Opyrator-compatible function is required to have an input parameter and return value based on Pydantic models. The input and output models are specified via type hints.

  2. Copy this code to a file, e.g. my_opyrator.py

  3. Run the UI server from command-line:

    opyrator launch-ui my_opyrator:hello_world
    

    The command output includes the local URL where your web app is served.

    <img style="width: 100%" src="https://raw.githubusercontent.com/ml-tooling/opyrator/main/docs/images/opyrator-hello-world-ui.png"/>
  4. Run the HTTP API server from command-line:

    opyrator launch-api my_opyrator:hello_world
    

    The command output includes the local URL where your web service is served.

    <img style="width: 100%" src="https://raw.githubusercontent.com/ml-tooling/opyrator/main/docs/images/opyrator-hello-world-api.png"/>
  5. Find out more usage information in the Features section or get inspired by our examples.
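Once `launch-api` is running, the service can be called from any HTTP client. The sketch below builds a JSON body matching the `hello_world` Input model and posts it with Python's standard library. The `/call` route and port `8080` are assumptions for illustration; the auto-generated interactive docs (served at `/docs`) list the service's actual routes and port.

```python
import json
from urllib import request

def build_payload(message: str) -> bytes:
    """Encode a request body matching hello_world's Input model."""
    return json.dumps({"message": message}).encode("utf-8")

def call_service(message: str, url: str = "http://localhost:8080/call") -> str:
    """POST to the running service and return the Output model's `message`.

    The `/call` path and port 8080 are assumptions; consult the service's
    interactive docs (at /docs) for the actual routes.
    """
    req = request.Request(
        url,
        data=build_payload(message),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]

# Usage (with the API server running):
#   call_service("Hello Opyrator")
```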

Examples


<p align="center"> 👉&nbsp; Try out and explore these examples in our playground <a href="https://opyrator-playground.mltooling.org">here</a> </p>

The following examples demonstrate how Opyrator can support a variety of tasks and use cases. All of these examples are bundled into a demo playground, which you can also deploy on your own machine via Docker:

docker run -p 8080:8080 mltooling/opyrator-playground:latest

Text Generation

<img style="width: 100%" src="https://raw.githubusercontent.com/ml-tooling/opyrator/main/docs/images/text-generation-demo.png"/> <details> <summary>Run this demo on your machine (click to expand...)</summary>

To run the demo on your local machine, execute the following commands:

git clone https://github.com/ml-tooling/opyrator
cd ./opyrator/examples/generate_text/
pip install -r requirements.txt
opyrator launch-ui app:generate_text --port 8051

Visit http://localhost:8051 in your browser to access the UI of the demo. Use launch-api instead of launch-ui to launch the HTTP API server.

</details>
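Every API that `launch-api` starts is a FastAPI app, so it also serves a machine-readable OpenAPI spec at `/openapi.json` (standard FastAPI behavior). A small helper for listing a spec's routes could look like the sketch below; the fetch itself is shown in a comment, since it needs a running server, and the port simply mirrors the `--port` flag used above.

```python
import json
from typing import Dict, List
from urllib import request

def list_endpoints(spec: Dict) -> List[str]:
    """Return 'METHOD /path' strings from an OpenAPI spec dict."""
    return [
        f"{method.upper()} {path}"
        for path, operations in spec.get("paths", {}).items()
        for method in operations
    ]

# With a server running (e.g. `opyrator launch-api app:generate_text --port 8051`):
# with request.urlopen("http://localhost:8051/openapi.json") as resp:
#     print(list_endpoints(json.load(resp)))
```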

Question Answering

<img style="width: 100%" src="https://raw.githubusercontent.com/ml-tooling/opyrator/main/docs/images/question-answering-demo.png"/> <details> <summary>Run this demo on your machine (click to expand...)</summary>

To run the demo on your local machine, execute the following commands:

git clone https://github.com/ml-tooling/opyrator
cd ./opyrator/examples/question_answering/
pip install -r requirements.txt
opyrator launch-ui app:question_answering --port 8051

Visit http://localhost:8051 in your browser to access the UI of the demo. Use launch-api instead of launch-ui to launch the HTTP API server.

</details>

Image Super Resolution

<img style="width: 100%" src="https://raw.githubusercontent.com/ml-tooling/opyrator/main/docs/images/image-super-resolution-demo.png"/> <details> <summary>Run this demo on your machine (click to expand...)</summary>

To run the demo on your local machine, execute the following commands:

git clone https://github.com/ml-tooling/opyrator
cd ./opyrator/examples/image_super_resolution/
pip install -r requirements.txt
opyrator launch-ui app:image_super_resolution --port 8051

Visit http://localhost:8051 in your browser to access the UI of the demo. Use launch-api instead of launch-ui to launch the HTTP API server.

</details>

Text Preprocessing

<img style="width: 100%" src="https://raw.githubusercontent.com/ml-tooling/opyrator/main/docs/images/text-preprocessing-demo.png"/> <details> <summary>Run this demo on your machine (click to expand...)</summary>

To run the demo on your local machine, execute the following commands:

git clone https://github.com/ml-tooling/opyrator
cd ./opyrator/examples/preprocess_text/
pip install -r requirements.txt
opyrator launch-ui app:preprocess_text --port 8051

Visit http://localhost:8051 in your browser to access the UI of the demo. Use launch-api instead of launch-ui to launch the HTTP API server.

</details>

Language Detection

<img style="width: 100%" src="https://raw.githubusercontent.com/ml-tooling/opyrator/main/docs/images/language-detection-demo.png"/> <details> <summary>Run this demo on your machine (click to expand...)</summary>

To run the demo on your local machine, execute the following commands:

git clone https://github.com/ml-tooling/opyrator
cd ./opyrator/examples/detect_language/
pip install -r requirements.txt
opyrator launch-ui app:detect_language --port 8051

Visit http://localhost:8051 in your browser to access the UI of the demo. Use launch-api instead of launch-ui to launch the HTTP API server.

</details>