<p align="center"> <picture> <img src="https://raw.githubusercontent.com/bitsky-tech/bridgic/refs/heads/main/docs/docs/assets/logo_white_bg.svg" alt="Bridgic" width="500"> </picture> </p>

License: MIT

Bridgic is the next-generation agent development framework for building intelligent systems.

By redefining the boundary between workflows and agents, Bridgic introduces a unified orchestration and runtime model which enables developers to seamlessly transition between predictable workflows and autonomous, creative agents within one system.

✨ The name "Bridgic" embodies our core philosophy — "Bridging Logic and Magic", where:

  • Logic represents deterministic and predictable execution flows, forming the foundation of reliable systems.
  • Magic refers to the autonomous parts that can make dynamic decisions and solve problems creatively.
<div align="center">

```mermaid
graph LR
    subgraph " "
        A["Deterministic Workflows<br/>(Logic)"]
        B["Autonomous Agents<br/>(Magic)"]
    end

    A ---> B
    B ---> A

    style A fill:#f9f9f9,stroke:#333,stroke-width:2px
    style B fill:#f9f9f9,stroke:#333,stroke-width:2px
```

</div>

📦 Installation

Bridgic requires Python 3.9 or newer.

Using pip

```shell
pip install bridgic
python -c "from bridgic.core import __version__; print(f'Bridgic version: {__version__}')"
```

Using uv

```shell
uv add bridgic
uv run python -c "from bridgic.core import __version__; print(f'Bridgic version: {__version__}')"
```

🔗 Key Features

🌀 Core Runtime Engine

  • Unified DDG Foundation: Both deterministic workflows and autonomous agents are orchestrated through the same Dynamic Directed Graph runtime model, providing a unified foundation for intelligent systems.
  • Dynamic Topology & Routing: Dynamic topology allows the graph structure to be modified at runtime, while dynamic routing enables conditional branching through the intuitive ferry_to() API.
  • Multi-Layered Orchestration APIs: Unified under DDG, Bridgic provides a Declarative API and ASL (Agent Structure Language), which gives developers flexibility in structuring their code.
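
The DDG ideas above can be illustrated in plain Python. The following is a conceptual toy, not Bridgic's actual runtime: nodes are callables in a directed graph, a node's return value decides where execution "ferries" next (dynamic routing), and new nodes can be registered while the graph is running (dynamic topology).

```python
# Conceptual sketch of a Dynamic Directed Graph (DDG) runtime.
# This is NOT Bridgic's implementation -- only an illustration of
# dynamic topology (adding nodes at runtime) and dynamic routing
# (a node deciding where execution goes next).

class MiniDDG:
    def __init__(self):
        self.nodes = {}          # name -> callable(state) -> (state, next_name | None)

    def add_node(self, name, fn):
        self.nodes[name] = fn    # the topology can change even mid-run

    def run(self, start, state):
        current = start
        while current is not None:
            state, current = self.nodes[current](state)
        return state

ddg = MiniDDG()

def make_scaler(k):
    def scale(state):
        state["x"] *= k
        return state, None       # returning None ends the run
    return scale

def classify(state):
    # Dynamic routing: choose the next node based on runtime data.
    nxt = "double" if state["x"] % 2 == 0 else "triple"
    # Dynamic topology: register the target node only when it is needed.
    if nxt not in ddg.nodes:
        ddg.add_node(nxt, make_scaler(2 if nxt == "double" else 3))
    return state, nxt

ddg.add_node("classify", classify)
result = ddg.run("classify", {"x": 4})   # routes to "double", yielding x == 8
```

The loop in `run` is the whole runtime: because routing decisions and node registration both happen inside node functions, the same machinery serves a fixed workflow and a graph that rewrites itself.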

🚀 Consistent Development Experience for Workflows and Agents

  • Program-defined Execution Mode: Build deterministic workflows with explicit control flow, where execution paths are defined by your code structure.
  • Model-driven Autonomous Execution Mode: Leverage ReCentAutoma and other autonomous agent modules where LLMs make dynamic decisions about tool selection and execution paths.

🧩 Modular Development Support

  • Modular Application Building: Complex intelligent systems can be composed from modular components, enabling component reuse and hierarchical nesting.
  • Parameter Resolving & Binding: Parameter resolution passes data among workers/automas automatically, eliminating the complexity of global state management.
  • Agent Structure Language: ASL is a Python-native DSL that lets developers express sophisticated agentic structures in a small amount of code, optimized for AI-assisted development.
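
Name-based parameter resolution can be sketched in plain Python (a conceptual illustration, not Bridgic's actual resolver): each step's result is recorded under the step's name, and downstream parameters are bound by matching names, so no global state object has to be threaded through the workflow.

```python
import inspect

# Illustrative sketch of name-based parameter resolution (not Bridgic's
# mechanism): a worker's result is stored under its name, and later
# workers receive arguments whose parameter names match earlier results.

def resolve_and_call(fn, results):
    params = inspect.signature(fn).parameters
    kwargs = {name: results[name] for name in params if name in results}
    return fn(**kwargs)

def load_numbers():
    return [1, 2, 3, 4]

def total(load_numbers):           # parameter name matches the producing step
    return sum(load_numbers)

def report(total):                 # likewise resolved from the 'total' result
    return f"sum = {total}"

results = {}
for step in (load_numbers, total, report):
    results[step.__name__] = resolve_and_call(step, results)
```

Here `report` never sees `load_numbers`' output, only what its own signature asks for, which is the property that makes components reusable and nestable.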

👥 Powerful Human-in-the-Loop Support

  • Human Interaction Based on Asynchronous Awaiting: Systems can pause execution and await human feedback asynchronously, enabling seamless integration of human judgment into automated workflows and agents.
  • Human Interaction Based on Interruption & Resuming: Long-running systems can be interrupted at any point, request external feedback, and seamlessly resume execution with state persistence and recovery.
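
Asynchronous awaiting of human feedback can be sketched with an `asyncio.Future` (a conceptual illustration, not Bridgic's API): the workflow coroutine suspends on an unresolved future, and an external actor resumes it by setting the result.

```python
import asyncio

# Conceptual human-in-the-loop sketch (not Bridgic's API): a workflow
# coroutine pauses on a Future and resumes once "human" input arrives.

async def approval_workflow(feedback: asyncio.Future) -> str:
    draft = "Deploy version 2.0"
    # Execution suspends here until someone supplies feedback.
    decision = await feedback
    return f"{draft}: {'approved' if decision else 'rejected'}"

async def human(feedback: asyncio.Future) -> None:
    await asyncio.sleep(0.01)      # the human takes some time to answer
    feedback.set_result(True)      # resuming the paused workflow

async def main() -> str:
    feedback = asyncio.get_running_loop().create_future()
    result, _ = await asyncio.gather(approval_workflow(feedback), human(feedback))
    return result

outcome = asyncio.run(main())      # -> "Deploy version 2.0: approved"
```

Interruption with persistence goes one step further: instead of an in-memory future, the pending state would be serialized so the process can exit entirely and resume later.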

🔌 Seamless Third-Party Integration

  • Technology-neutral Model Integration: A unified abstraction enables seamless integration with any LLM provider.
  • Systematic MCP Integration: MCP integration lets your application connect to MCP servers and use their tools as first-class workers.
  • Seamless Enterprise Observability Integration: Integration with leading observability platforms (such as Opik and LangWatch) keeps your agentic systems transparent, debuggable, and optimizable.
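
A provider-neutral model abstraction is commonly expressed as an interface. Here is a hedged sketch using a Python `Protocol`, with a fake echo provider standing in for a real LLM backend; the names `ChatModel`, `EchoProvider`, and `answer` are illustrative, not Bridgic classes.

```python
from typing import List, Protocol

# Sketch of a provider-neutral chat interface (illustrative only; Bridgic's
# real abstraction differs). Any backend that implements `chat` plugs in.

class ChatModel(Protocol):
    def chat(self, messages: List[dict]) -> str: ...

class EchoProvider:
    """A stand-in backend that just echoes the last user message."""
    def chat(self, messages: List[dict]) -> str:
        return messages[-1]["content"]

def answer(model: ChatModel, question: str) -> str:
    # Application code depends only on the ChatModel interface,
    # never on a concrete provider.
    return model.chat([{"role": "user", "content": question}])

reply = answer(EchoProvider(), "ping")   # -> "ping"
```

Swapping providers then means supplying a different object with the same `chat` method; no application code changes.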

🚀 Get Started

This section demonstrates Bridgic's core capabilities through practical examples.

You'll learn how to build intelligent systems with Bridgic, from a simple chatbot to an autonomous agentic system. Along the way you will see features such as worker orchestration, dynamic routing, dynamic topology changes, and parameter resolving.

Part I examples include implementations in both the normal API and ASL, showing how ASL simplifies workflow definition with declarative syntax.

LLM Setup

Before diving into the examples, set up your LLM instance.

```python
import os
from bridgic.llms.openai import OpenAILlm, OpenAIConfiguration

_api_key = os.environ.get("OPENAI_API_KEY")
_api_base = os.environ.get("OPENAI_API_BASE")
_model_name = os.environ.get("OPENAI_MODEL_NAME")

llm = OpenAILlm(
    api_key=_api_key,
    api_base=_api_base,
    configuration=OpenAIConfiguration(model=_model_name),
    timeout=120,
)
```
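
The setup above reads three environment variables from the shell. A minimal example follows; the values are placeholders only, so substitute your own provider credentials:

```shell
# Placeholder values -- replace with your own credentials and model.
export OPENAI_API_KEY="your-api-key"
export OPENAI_API_BASE="https://api.openai.com/v1"
export OPENAI_MODEL_NAME="gpt-4o-mini"
```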

Part I: Workflow Orchestration

Each example in this part provides two implementations:

  • The declarative API approach helps you understand how Bridgic works under the hood.
  • The ASL approach shows how it simplifies workflow definition with declarative syntax.
<img src="./docs/images/bridgic_api_hierarchy.png" alt="Bridgic API Hierarchy" width="620"/>

Example 1: Build Your First Chatbot Using Bridgic

Core Features:

  • Declare static dependencies between workflow steps
  • Mark start and output workers within the workflow
  • Reuse already implemented Automa components
<details>
<summary>Build with Normal API</summary>

```python
from typing import List, Dict, Optional

from bridgic.core.model.types import Message
from bridgic.core.automa import GraphAutoma, worker, RunningOptions

class DivideConquerWorkflow(GraphAutoma):
    """Break down a query into sub-queries and answer each one."""

    @worker(is_start=True)
    async def break_down_query(self, user_input: str) -> List[str]:
        """Break down the query into a list of sub-queries."""
        llm_response = await llm.achat(
            messages=[
                Message.from_text(
                    text="Break down the query into multiple sub-queries and only return the sub-queries",
                    role="system"
                ),
                Message.from_text(text=user_input, role="user"),
            ]
        )
        return [item.strip() for item in llm_response.message.content.split("\n") if item.strip()]

    @worker(dependencies=["break_down_query"], is_output=True)
    async def query_answer(self, queries: List[str]) -> Dict[str, str]:
        """Generate answers for each sub-query."""
        answers = []
        for query in queries:
            response = await llm.achat(
                messages=[
                    Message.from_text(text="Answer the given query briefly", role="system"),
                    Message.from_text(text=query, role="user"),
                ]
            )
            answers.append(response.message.content)

        return {
            query: answer
            for query, answer in zip(queries, answers)
        }

class QuestionSolverBot(GraphAutoma):
    """A bot that solves questions by breaking them down and merging answers."""

    def __init__(self, name: Optional[str] = None, running_options: Optional[RunningOptions] = None):
        super().__init__(name=name, running_options=running_options)
        # Add DivideConquerWorkflow as a sub-automa
        divide_conquer = DivideConquerWorkflow()
        self.add_worker(
            key="divide_conquer_workflow",
            worker=divide_conquer,
            is_start=True
        )
        # Set dependency: merge_answers depends on divide_conquer_workflow
        self.add_dependency("merge_answers", "divide_conquer_workflow")

    @worker(is_output=True)
    async def merge_answers(self, qa_pairs: Dict[str, str], user_input: str) -> str:
        """Merge individual answers into a unified response."""
        # The original snippet was truncated here; this body follows the same
        # llm.achat pattern used by the workers above.
        answers_text = "\n\n".join(f"Q: {q}\nA: {a}" for q, a in qa_pairs.items())
        response = await llm.achat(
            messages=[
                Message.from_text(
                    text="Merge the sub-answers into a single unified answer to the user's question",
                    role="system"
                ),
                Message.from_text(text=f"Question: {user_input}\n\n{answers_text}", role="user"),
            ]
        )
        return response.message.content
```

</details>
