
<details> <summary>Table of Contents</summary> <!--toc:start--> <!--toc:end--> </details>

<a name="readme-top"></a>

<!-- PROJECT SHIELDS --> <!-- *** I'm using markdown "reference style" links for readability. *** Reference links are enclosed in brackets [ ] instead of parentheses ( ). *** See the bottom of this document for the declaration of the reference variables *** for contributors-url, forks-url, etc. This is an optional, concise syntax you may use. *** https://www.markdownguide.org/basic-syntax/#reference-style-links -->

CI Coverage Status [![Crate Badge]][Crate] [![Docs Badge]][API Docs] [![Contributors][contributors-shield]][contributors-url] [![Stargazers][stars-shield]][stars-url] Discord [![MIT License][license-shield]][license-url] [![LinkedIn][linkedin-shield]][linkedin-url]

<!-- PROJECT LOGO --> <br /> <div align="center"> <a href="https://github.com/bosun-ai/swiftide"> <img src="https://raw.githubusercontent.com/bosun-ai/swiftide/master/images/logo.png" alt="Logo" width="250" height="250"> </a> <h3 align="center">Swiftide</h3> <p align="center"> Fast, streaming indexing, query, and agentic LLM applications in Rust <br /> <a href="https://swiftide.rs"><strong>Read more on swiftide.rs »</strong></a> <br /> <br /> <!-- <a href="https://github.com/bosun-ai/swiftide">View Demo</a> --> <a href="https://docs.rs/swiftide/latest/swiftide/">API Docs</a> · <a href="https://github.com/bosun-ai/swiftide/issues/new?labels=bug&template=bug_report.md">Report Bug</a> · <a href="https://github.com/bosun-ai/swiftide/issues/new?labels=enhancement&template=feature_request.md">Request Feature</a> · <a href="https://discord.gg/3jjXYen9UY">Discord</a> </p> </div> <!-- ABOUT THE PROJECT --> <p align="right">(<a href="#readme-top">back to top</a>)</p>

## What is Swiftide?

<!-- [![Product Name Screen Shot][product-screenshot]](https://example.com) -->

Swiftide is a Rust library for building LLM applications: from performing a simple prompt completion, to building fast, streaming indexing and querying pipelines, to building agents that can use tools and call other agents.
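As a taste of the smallest primitive, here is a minimal sketch of a one-shot prompt completion. The model name, builder methods, and import paths are assumptions based on Swiftide's OpenAI integration; check the API docs for the current signatures.

```rust
use swiftide::integrations::openai::OpenAI;
use swiftide::traits::SimplePrompt;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Reads OPENAI_API_KEY from the environment; the model name is illustrative.
    let openai_client = OpenAI::builder()
        .default_prompt_model("gpt-4o-mini")
        .build()?;

    // SimplePrompt: one prompt in, one completion out.
    let answer = openai_client.prompt("What is Swiftide?".into()).await?;
    println!("{answer}");

    Ok(())
}
```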

### High-level features

  • Simple primitives for common LLM tasks
  • Build fast, streaming indexing and querying pipelines
  • Easily build agents, mix and match with previously built pipelines
  • A modular and extendable API, with minimal abstractions
  • Integrations with popular LLMs and storage providers
  • Ready to use pipeline transformations or bring your own
  • Build graph-like workflows with Tasks
  • Langfuse support
<div align="center"> <img src="https://raw.githubusercontent.com/bosun-ai/swiftide/master/images/overview.png" alt="Swiftide overview" width="100%" > </div>

Part of the bosun.ai project, an upcoming platform for autonomous code improvement.

We <3 feedback: project ideas, suggestions, and complaints are very welcome. Feel free to open an issue or contact us on Discord.

> [!CAUTION]
> Swiftide is under heavy development and can have breaking changes. Documentation might fall short of all features and, despite our efforts, be slightly outdated. We recommend always keeping an eye on our GitHub and API documentation. If you find an issue or have any kind of feedback, we'd love to hear from you.

<p align="right">(<a href="#readme-top">back to top</a>)</p>

## Latest updates on our blog :fire:

More on our blog

<p align="right">(<a href="#readme-top">back to top</a>)</p>

## Examples

Indexing a local code project, chunking into smaller pieces, enriching the nodes with metadata, and persisting into Qdrant:

```rust
indexing::Pipeline::from_loader(FileLoader::new(".").with_extensions(&["rs"]))
    .with_default_llm_client(openai_client.clone())
    .filter_cached(Redis::try_from_url(redis_url, "swiftide-examples")?)
    .then_chunk(ChunkCode::try_for_language_and_chunk_size("rust", 10..2048)?)
    .then(MetadataQACode::default())
    .then(move |node| my_own_thing(node))
    .then_in_batch(Embed::new(openai_client.clone()))
    .then_store_with(
        Qdrant::builder()
            .batch_size(50)
            .vector_size(1536)
            .build()?,
    )
    .run()
    .await?;
```
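The `my_own_thing` closure above stands in for any custom per-node step: closures passed to `.then(...)` take a `Node` and return a `Result<Node>`. A hypothetical sketch (the metadata key and behaviour are invented for illustration):

```rust
use swiftide::indexing::Node;

// Hypothetical transformer: tag each chunk with its size so it can be
// inspected or filtered downstream. The "chunk_size" key is illustrative.
fn my_own_thing(mut node: Node) -> anyhow::Result<Node> {
    let size = node.chunk.len();
    node.metadata.insert("chunk_size", size.to_string());
    Ok(node)
}
```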

Querying for an example of how to use the query pipeline:

```rust
query::Pipeline::default()
    .then_transform_query(GenerateSubquestions::from_client(openai_client.clone()))
    .then_transform_query(Embed::from_client(openai_client.clone()))
    .then_retrieve(qdrant.clone())
    .then_answer(Simple::from_client(openai_client.clone()))
    .query("How can I use the query pipeline in Swiftide?")
    .await?;
```

Running an agent that can search code:

```rust
#[swiftide::tool(
    description = "Searches code",
    param(name = "code_query", description = "The code query")
)]
async fn search_code(
    context: &dyn AgentContext,
    code_query: &str,
) -> Result<ToolOutput, ToolError> {
    let command_output = context
        .executor()
        .exec_cmd(&Command::shell(format!("rg '{code_query}'")))
        .await?;

    Ok(command_output.into())
}

agents::Agent::builder()
    .llm(&openai)
    .tools(vec![search_code()])
    .build()?
    .query("In what file can I find an example of a swiftide agent?")
    .await?;
```

Agents loop over LLM calls, tool calls, and lifecycle hooks until a final answer is reached.
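Those lifecycle hooks are registered on the same builder. A hedged sketch (hook names such as `before_all` follow Swiftide's agent builder pattern, but verify them against the API docs before relying on them):

```rust
// Hypothetical hook wiring on the agent builder shown above.
let agent = agents::Agent::builder()
    .llm(&openai)
    .tools(vec![search_code()])
    // Runs once, before the agent's first completion.
    .before_all(move |_context| {
        Box::pin(async move {
            println!("agent starting");
            Ok(())
        })
    })
    .build()?;
```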

You can find more detailed examples in /examples.

<p align="right">(<a href="#readme-top">back to top</a>)</p>

## Vision

Our goal is to create a fast, extendable platform for building LLM applications in Rust, with an easy-to-use and easy-to-extend API, to further the development of automated AI applications.

<p align="right">(<a href="#readme-top">back to top</a>)</p>

## Features

  • Simple primitives for common LLM tasks
  • Fast, modular streaming indexing pipeline with async, parallel processing
  • Experimental query pipeline
  • Experimental agent framework
  • A variety of loaders, transformers, semantic chunkers, embedders, and more
  • Bring your own transformers by extending straightforward traits or use a closure
  • Splitting and merging pipelines
  • Jinja-like templating for prompts
  • Store into multiple backends
  • Integrations with OpenAI, Groq, Gemini, Anthropic, Redis, Qdrant, Ollama, FastEmbed-rs, Fluvio, LanceDB, and Treesitter
  • Evaluate pipelines with RAGAS
  • Sparse vector support for hybrid search
  • `tracing` supported for logging and tracing; see /examples and the `tracing` crate for more information
  • Tracing layer for exporting to Langfuse
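For "bring your own transformers", the trait-based route looks roughly like this. This is a sketch assuming the `Transformer` trait exposes an async `transform_node` method; the struct and its behaviour are invented for illustration.

```rust
use async_trait::async_trait;
use swiftide::indexing::{Node, Transformer};

// Illustrative transformer that normalizes chunk text before embedding.
struct TrimWhitespace;

#[async_trait]
impl Transformer for TrimWhitespace {
    // Called once for every node that flows through the pipeline.
    async fn transform_node(&self, mut node: Node) -> anyhow::Result<Node> {
        node.chunk = node.chunk.trim().to_string();
        Ok(node)
    }
}
```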

### In detail

| Feature | Details |
| --- | --- |
| Supported Large Language Model providers | OpenAI (and Azure) <br> Anthropic <br> Gemini <br> OpenRouter <br> AWS Bedrock (Converse API) <br> Groq - All models <br> Ollama - All models |
| Agents | All the boilerplate for autonomous agents so you don't have to |
| Tasks | Build graph-like workflows with tasks, combining all the above to build complex applications |
| Loading data | Files <br> Scraping <br> Fluvio <br> Parquet <br> Kafka <br> Other pipelines and streams |
| Example and pre-built transformers and metadata generation | |
