Fractalic

Fractalic: Build and version-control AI systems using Markdown & YAML. Combine LLM calls, shell commands, and modular workflows in a human-readable format. Docker-first installation, Git-native tracking.


<p align="center"> <img src="https://raw.githubusercontent.com/fractalic-ai/fractalic/main/docs/images/fractalic_hero.png" alt="Fractalic Hero Image"> </p>

What is Fractalic?

Design, run and evolve multi‑model AI workflows in one executable Markdown file—no glue code—with precise context control, tool integration and git‑traceable reproducibility.

What's New in v0.1.x

This update focuses on making Fractalic more practical for everyday use. We added broader model support, more robust tool handling, and new ways to deploy and debug. Here's a rundown of the changes.

🧠 AI & Model Support

  • 🤖 LiteLLM integration, supporting over 100 models and providers.
  • 🔄 Scripts now work as complete agents, with two-way parameter passing in LLM modes.
  • 📊 Basic token tracking and cost analytics (still in early stages).
  • 🧠 Improved context diffs (.ctx) for multi-model workflows.

⚡ MCP & Tool Ecosystem

  • ⚡ Full MCP support, including schema caching.
  • 🔐 OAuth 2.0 and token management for MCP services.
  • 🛒 MCP marketplace in Fractalic Studio for one-click installs.
  • 🔧 Fractalic Tools marketplace with one-click options: Telegram, HubSpot CRM, Tavily web search, MarkItDown, HubSpot process-mining, ultra-fast grep, file patching, and others.
  • 🐍 Support for using Python modules as tools.
  • 👁️ Tool call tracing, available in context and through the Studio inspector.

🚀 Deployment & Publishing

  • 🚀 Publisher system with Docker builds and a lightweight server for REST APIs, including Swagger docs.
  • 🐳 Automated deployments with process supervision.
  • 📦 Fractalic now available as a Python package for standalone use or importing as a module.

🎨 Fractalic Studio (IDE)

  • 🖥️ Development environment with session views, diff inspector, editor, and deployment tools.
  • 📝 Notebook-style editor for building workflows step by step.
  • 🛒 Integrated marketplaces for MCP servers and tools.
  • 🔍 Debugging features like execution tracing and context inspection.

📚 Documentation & Stability

  • 📖 Detailed docs covering all features and examples.
  • 🛠️ Better stability for tool executions, with improved structured outputs.

Getting Started

Installation

Method 1: Pre-Built Docker Image (Recommended)

Run the published container directly with all services (UI + API + AI server):

docker run -d --name fractalic --network bridge -p 3000:3000 -p 8000:8000 -p 8001:8001 -p 5859:5859 -v /var/run/docker.sock:/var/run/docker.sock --env HOST=0.0.0.0 ghcr.io/fractalic-ai/fractalic:main

Then open: http://localhost:3000

Method 2: Build from Source (Full Stack)

Builds the latest version from the GitHub repositories and runs it in Docker:

curl -s https://raw.githubusercontent.com/fractalic-ai/fractalic/main/deploy/docker-deploy.sh | bash

This clones both fractalic and fractalic-ui, builds the Docker image locally, and starts all services:

  • UI: http://localhost:3000
  • API: http://localhost:8000
  • AI Server: http://localhost:8001
  • MCP Manager: http://localhost:5859
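Once the stack is up, it can be handy to confirm all four services are reachable. Below is a minimal sketch; the port assignments come from the deploy script's defaults above, but the assumption that each service answers plain HTTP at its root path is mine, not documented behavior.

```python
"""Quick reachability check for a local Fractalic stack (a sketch)."""
from urllib.request import urlopen
from urllib.error import URLError

# Default ports from the docker-deploy.sh output above.
SERVICES = {
    "UI": 3000,
    "API": 8000,
    "AI Server": 8001,
    "MCP Manager": 5859,
}


def service_urls(host: str = "localhost") -> dict[str, str]:
    """Map each service name to its local URL."""
    return {name: f"http://{host}:{port}" for name, port in SERVICES.items()}


def probe(url: str, timeout: float = 2.0) -> bool:
    """Return True if anything answers HTTP at the URL."""
    try:
        with urlopen(url, timeout=timeout):
            return True
    except (URLError, OSError):
        return False


if __name__ == "__main__":
    for name, url in service_urls().items():
        print(f"{name:12s} {url:30s} {'up' if probe(url) else 'down'}")
```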

Method 3: Local Development Setup

Full source installation with both backend and frontend for development:

git clone https://github.com/fractalic-ai/fractalic.git
cd fractalic
./local-dev-setup.sh

This script will:

  • Clone fractalic-ui repository
  • Set up Python virtual environment
  • Install all dependencies
  • Start both backend and frontend servers
  • Open http://localhost:3000 automatically

Method 4: Python Package - CLI

Install for command-line usage (no UI):

pip install fractalic

Check install:

fractalic --help

Run a workflow file:

fractalic your_workflow.md
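If you want to drive the CLI from another script, a thin subprocess wrapper is enough. This sketch assumes only the `fractalic <file>` invocation shown above; any additional flags would need checking against `fractalic --help`.

```python
"""Invoke the fractalic CLI from Python (a sketch)."""
import shutil
import subprocess


def build_command(workflow: str) -> list[str]:
    """Compose the CLI invocation for a workflow file."""
    return ["fractalic", workflow]


def run_workflow(workflow: str) -> subprocess.CompletedProcess:
    """Run the CLI and capture its output.

    Raises if fractalic is not on PATH or the run exits non-zero.
    """
    if shutil.which("fractalic") is None:
        raise RuntimeError("fractalic not found; try `pip install fractalic`")
    return subprocess.run(
        build_command(workflow), capture_output=True, text=True, check=True
    )
```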

Method 5: Python Package - API

Install for programmatic usage in Python:

pip install fractalic

Then use in your Python code:

import fractalic

# Run a workflow file
result = fractalic.run_fractalic('workflow.md')

# Run with user input parameters
result = fractalic.run_fractalic('workflow.md', param_input_user_request='Tesla analysis')

# Run with custom model and API key
result = fractalic.run_fractalic(
    'workflow.md', 
    model='openai/gpt-4',
    api_key='your-api-key'
)

# Result is a dictionary with execution details
print(f"Success: {result['success']}")
print(f"Branch: {result['branch_name']}")
print(f"Context file: {result['ctx_file']}")
print(f"Context hash: {result['ctx_hash']}")
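Building on the API above, a small wrapper can turn the result dictionary into a one-line summary. The keys (`success`, `branch_name`, `ctx_file`, `ctx_hash`) are taken from the snippet above; the runner is injectable so the logic can be exercised without installing fractalic or spending tokens — in real use, pass `fractalic.run_fractalic`.

```python
"""Thin wrapper around fractalic.run_fractalic (a sketch)."""
from typing import Callable


def run_and_summarize(workflow: str, runner: Callable[..., dict], **kwargs) -> str:
    """Run a workflow and return a one-line summary of the outcome.

    `runner` should have the run_fractalic signature; keyword arguments
    (model, api_key, param_input_* values) are passed through unchanged.
    """
    result = runner(workflow, **kwargs)
    if not result.get("success"):
        return f"{workflow}: FAILED"
    return (
        f"{workflow}: ok on branch {result['branch_name']} "
        f"(ctx {result['ctx_file']} @ {result['ctx_hash']})"
    )
```

Usage with the real API would be `run_and_summarize('workflow.md', fractalic.run_fractalic, model='openai/gpt-4')`.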

Basic Principles

  • One executable Markdown file: Your workflow specification is your runtime. Write what you want in plain Markdown, run it directly. No translation between documentation and code.

  • No glue code: Replace orchestration scripts in Python, JavaScript, or any other programming language with 3-6 line plain-text YAML operations.

  • Multi-model workflows: Switch between LLM models and providers in the same document.

  • Precise context control: Your Markdown becomes a manageable LLM context as an addressable tree. Reference exact sections, branches, or lists. LLMs see only what you specify—no hidden prompt stuffing.

  • Tool integration: Connect MCP servers, Python functions, and shell commands. All outputs flow back into your document structure for the next operation.

  • Human‑readable audit trail: Each run outputs a stepwise execution tree plus a complete change log (new blocks, edits, tool calls). Skim it like a focused diff—only actions and their effects, no noise.

Fractalic Operations

Fractalic is built around a set of key operations. These are deterministic instructions that the Fractalic interpreter executes in sequence to process your workflow. Each operation performs a specific task, like calling an AI model, running a shell command, or manipulating the document's content.

  • @llm – Sends specified blocks of content to any supported language model, including local models.
  • @shell – Runs terminal commands, with the output captured as a new block in the document.
  • @run – Executes another Fractalic Markdown file, allowing you to modularize workflows, pass parameters, and receive results.
  • @import – Includes content from other files directly into your document.
  • @return – Sends specified blocks back as a result to a parent workflow that used the @run operation.

Each operation can be customized with a variety of parameters. For a detailed reference of all available options, please see the Operations Reference documentation.
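To illustrate how @run and @return compose into modular workflows, here is a rough sketch of a parent file delegating to a child file. The file name, block ids, and parameter names (`file:` for @run, `block:` for @return) are extrapolated from the @llm example below and may not match the real schema — consult the Operations Reference before relying on them.

```markdown
<!-- parent.md: delegate a subtask to another workflow file -->
@run
file: summarize.md

<!-- summarize.md: do the work and return a block to the parent -->
# Task
Summarize the findings in three bullet points.

@llm
prompt: Summarize
block:
    - task

@return
block: llm-response
```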

Basic Examples

The following examples demonstrate how you can combine Fractalic's operations (using YAML syntax) with your knowledge (written as Markdown blocks) to create powerful, automated workflows. You'll see how to integrate with external tools, generate and manipulate content, and even create workflows that write themselves.

Note on examples: The execution results are shown as a diff. Green highlighted text (+) represents the new content added to the document after the workflow runs. The + markers are an artifact of GitHub's diff formatting and won't appear in the actual output file. Some tool outputs have been truncated for brevity.

Example: Web Search → Notion Page (MCP Integration)

This example demonstrates a complete workflow: it uses the tavily_search tool to find information on the web, then passes the structured results to the mcp/notion tool to create a new, formatted page in Notion. This showcases how Fractalic can chain different services together to automate a research and publishing task.

# Web search task
Find top-5 world news for today about AI, provide a brief summary of each, print them under the "# AI news" header (add an empty line before it), and supplement each with a direct link

@llm
prompt: Search news 
tools: tavily_search

# Notion task
Based on the extracted news, extract important insights, keep a direct link for each item, and save them as a newspaper (please format it properly) to my Notion, creating a new page there - Daily AI news

@llm
prompt: Process news to Notion
block: 
    - notion-task
    - ai-news
tools: mcp/notion

Execution result:

# Web search task
Find top world news for today about AI, provide a brief summary of each, print them under the "# AI news" header (add an empty line before it), and supplement each with a direct link

@llm
prompt: Search news 
tools: tavily_search

+ # LLM response block
+ 
+ > TOOL CALL, id: call_7fl4HiwuAV7crDV9TNJyyCu1
+ tool: tavily_search
+ args:
+ {
+   "task": "sea