# Swiftide
Fast, streaming indexing, query, and agentic LLM applications in Rust
- What is Swiftide?
- Latest updates on our blog :fire:
- Examples
- Vision
- Features
- Getting Started
- Usage and concepts
- Contributing
- Core Team Members
- License
<a name="readme-top"></a>
<!-- PROJECT SHIELDS --> <!-- *** I'm using markdown "reference style" links for readability. *** Reference links are enclosed in brackets [ ] instead of parentheses ( ). *** See the bottom of this document for the declaration of the reference variables *** for contributors-url, forks-url, etc. This is an optional, concise syntax you may use. *** https://www.markdownguide.org/basic-syntax/#reference-style-links -->
[![Crate Badge]][Crate]
[![Docs Badge]][API Docs]
[![Contributors][contributors-shield]][contributors-url]
[![Stargazers][stars-shield]][stars-url]
[![MIT License][license-shield]][license-url]
[![LinkedIn][linkedin-shield]][linkedin-url]
## What is Swiftide?
<!-- [![Product Name Screen Shot][product-screenshot]](https://example.com) -->
Swiftide is a Rust library for building LLM applications: from performing a simple prompt completion, to building fast, streaming indexing and querying pipelines, to building agents that can use tools and call other agents.
### High level features
- Simple primitives for common LLM tasks
- Build fast, streaming indexing and querying pipelines
- Easily build agents, mix and match with previously built pipelines
- A modular and extendable API, with minimal abstractions
- Integrations with popular LLMs and storage providers
- Ready to use pipeline transformations or bring your own
- Build graph like workflows with Tasks
- Langfuse support
Part of the bosun.ai project. An upcoming platform for autonomous code improvement.
We <3 feedback: project ideas, suggestions, and complaints are very welcome. Feel free to open an issue or contact us on discord.
<p align="right">(<a href="#readme-top">back to top</a>)</p>

> [!CAUTION]
> Swiftide is under heavy development and can have breaking changes. Documentation might fall short of all features and may, despite our efforts, be slightly outdated. We recommend always keeping an eye on our github and api documentation. If you find an issue or have any kind of feedback, we'd love to hear from you.
## Latest updates on our blog :fire:
- Swiftide 0.31 - Tasks, Langfuse, Multi-Modal, and more
- Swiftide 0.27 - Easy human-in-the-loop flows for agentic AI
- Swiftide 0.26 - Streaming agents
- Releasing kwaak with kwaak
- Swiftide 0.16 - AI Agents in Rust
- Rust in LLM based tools for performance
- Evaluate Swiftide pipelines with Ragas (2024-09-15)
- Release - Swiftide 0.12 (2024-09-13)
- Local code intel with Ollama, FastEmbed and OpenTelemetry (2024-09-04)
More on our blog
<p align="right">(<a href="#readme-top">back to top</a>)</p>

## Examples
Indexing a local code project, chunking into smaller pieces, enriching the nodes with metadata, and persisting into Qdrant:
```rust
indexing::Pipeline::from_loader(FileLoader::new(".").with_extensions(&["rs"]))
    .with_default_llm_client(openai_client.clone())
    .filter_cached(Redis::try_from_url(
        redis_url,
        "swiftide-examples",
    )?)
    .then_chunk(ChunkCode::try_for_language_and_chunk_size(
        "rust",
        10..2048,
    )?)
    .then(MetadataQACode::default())
    .then(move |node| my_own_thing(node))
    .then_in_batch(Embed::new(openai_client.clone()))
    .then_store_with(
        Qdrant::builder()
            .batch_size(50)
            .vector_size(1536)
            .build()?,
    )
    .run()
    .await?;
```
Querying for an example on how to use the query pipeline:
```rust
query::Pipeline::default()
    .then_transform_query(GenerateSubquestions::from_client(
        openai_client.clone(),
    ))
    .then_transform_query(Embed::from_client(
        openai_client.clone(),
    ))
    .then_retrieve(qdrant.clone())
    .then_answer(Simple::from_client(openai_client.clone()))
    .query("How can I use the query pipeline in Swiftide?")
    .await?;
```
Running an agent that can search code:
```rust
#[swiftide::tool(
    description = "Searches code",
    param(name = "code_query", description = "The code query")
)]
async fn search_code(
    context: &dyn AgentContext,
    code_query: &str,
) -> Result<ToolOutput, ToolError> {
    let command_output = context
        .executor()
        .exec_cmd(&Command::shell(format!("rg '{code_query}'")))
        .await?;
    Ok(command_output.into())
}

agents::Agent::builder()
    .llm(&openai)
    .tools(vec![search_code()])
    .build()?
    .query("In what file can I find an example of a swiftide agent?")
    .await?;
```
Agents loop over LLM calls, tool calls, and lifecycle hooks until a final answer is reached.
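That loop can be sketched in plain Rust. This is a simplified, self-contained illustration of the control flow only; `FakeLlm`, `Completion`, and `run_tool` are hypothetical stand-ins, not Swiftide types (the real agent is async, with error handling and lifecycle hooks).

```rust
// What the "LLM" returns on each turn: either a tool invocation or a final answer.
enum Completion {
    ToolCall(String),
    FinalAnswer(String),
}

// A stand-in LLM that requests one tool call, then answers.
struct FakeLlm {
    turns: usize,
}

impl FakeLlm {
    fn complete(&mut self, _history: &[String]) -> Completion {
        self.turns += 1;
        if self.turns == 1 {
            Completion::ToolCall("search_code".into())
        } else {
            Completion::FinalAnswer("Found it in examples/agent.rs".into())
        }
    }
}

// A stand-in tool executor.
fn run_tool(name: &str) -> String {
    format!("output of {name}")
}

// The loop: call the LLM, execute any requested tool, feed its output back
// into the history, and stop once the LLM produces a final answer.
fn run_agent(llm: &mut FakeLlm, query: &str) -> String {
    let mut history = vec![query.to_string()];
    loop {
        match llm.complete(&history) {
            Completion::ToolCall(name) => history.push(run_tool(&name)),
            Completion::FinalAnswer(answer) => return answer,
        }
    }
}
```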
You can find more detailed examples in /examples.
<p align="right">(<a href="#readme-top">back to top</a>)</p>

## Vision
Our goal is to create a fast, extendable platform for building LLM applications in Rust, with an easy-to-use and easy-to-extend API, to further the development of automated AI applications.
<p align="right">(<a href="#readme-top">back to top</a>)</p>

## Features
- Simple primitives for common LLM tasks
- Fast, modular streaming indexing pipeline with async, parallel processing
- Experimental query pipeline
- Experimental agent framework
- A variety of loaders, transformers, semantic chunkers, embedders, and more
- Bring your own transformers by extending straightforward traits or use a closure
- Splitting and merging pipelines
- Jinja-like templating for prompts
- Store into multiple backends
- Integrations with OpenAI, Groq, Gemini, Anthropic, Redis, Qdrant, Ollama, FastEmbed-rs, Fluvio, LanceDB, and Treesitter
- Evaluate pipelines with RAGAS
- Sparse vector support for hybrid search
- `tracing` supported for logging and tracing, see /examples and the `tracing` crate for more information
- Tracing layer for exporting to Langfuse
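The "bring your own transformers" idea above can be sketched as follows. This is a simplified illustration of the pattern, assuming a trait with a blanket impl for closures; `Node`, `Transformer`, and `apply_all` are hypothetical stand-ins, not Swiftide's actual (async, fallible) traits.

```rust
// A minimal pipeline node: just the chunk of text being processed.
struct Node {
    chunk: String,
}

// A transformation step in the pipeline.
trait Transformer {
    fn transform(&self, node: Node) -> Node;
}

// Blanket impl: any closure `Fn(Node) -> Node` is automatically a Transformer,
// which is what lets you drop a plain closure into a pipeline.
impl<F: Fn(Node) -> Node> Transformer for F {
    fn transform(&self, node: Node) -> Node {
        self(node)
    }
}

// Run a node through a sequence of transformers in order.
fn apply_all(transformers: &[&dyn Transformer], mut node: Node) -> Node {
    for t in transformers {
        node = t.transform(node);
    }
    node
}
```

A custom step is then just a closure, e.g. `|mut n: Node| { n.chunk = n.chunk.to_uppercase(); n }`, passed alongside struct-based transformers.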
### In detail
| Feature | Details |
| --- | --- |
| Supported Large Language Model providers | OpenAI (and Azure) <br> Anthropic <br> Gemini <br> OpenRouter <br> AWS Bedrock (Converse API) <br> Groq - All models <br> Ollama - All models |
| Agents | All the boilerplate for autonomous agents so you don't have to |
| Tasks | Build graph-like workflows with tasks, combining all the above to build complex applications |
| Loading data | Files <br> Scraping <br> Fluvio <br> Parquet <br> Kafka <br> Other pipelines and streams |
| Example and pre-built transformers and metadata generation | |
