AgentMake AI
AgentMake AI: an agent development kit (ADK) for developing agentic AI applications that support 18 AI backends and work with 7 agentic components, such as tools and agents. (Developer: Eliran Wong)
Supported backends: anthropic, azure_anthropic, azure_openai, azure_sdk, cohere, custom, deepseek, genai, github, googleai, groq, llamacpp, mistral, ollama, openai, vertexai, xai
Audio Introduction
- 9-min introduction
- 24-min introduction
Latest projects
The following two projects are in active development. Both are powered by AgentMake AI and AgentMake AI MCP Servers:
Sibling Projects
This SDK incorporates the best aspects of our favorite projects, LetMeDoIt AI, ToolMate AI and TeamGen AI, to create a library aimed at further advancing the development of agentic AI applications.
The agentmake ecosystem is further extended by two companion projects:
WebUI - agentmakestudio
MCP Servers - agentmakemcp
Supported Platforms
Windows, macOS, Linux, ChromeOS, Android via Termux Terminal and Pixel Terminal
Supported backends
anthropic - Anthropic API [docs]
azure_anthropic - Claude models via Azure Service API [docs]
azure_cohere - Cohere models via Azure Service API [docs]
azure_deepseek - DeepSeek models via Azure Service API [docs]
azure_mistral - Mistral models via Azure Service API [docs]
azure_openai - OpenAI models via Azure Service API [docs]
azure_xai - Grok models via Azure Service API [docs]
azure_sdk - Other models via Azure AI Inference API [docs]
cohere - Cohere API [docs]
custom - any openai-compatible backends that support function calling
custom1 - any openai-compatible backends that support function calling
custom2 - any openai-compatible backends that support function calling
deepseek - DeepSeek API [docs]
genai - Vertex AI or Google AI [docs]
github - Azure OpenAI Service via GitHub Token [docs]
github_any - Azure AI Inference via GitHub Token [docs]
groq - Groq Cloud API [docs]
llamacpp - Llama.cpp Server [docs] - local setup required
mistral - Mistral API [docs]
ollama - Ollama [docs] - local setup required
openai - OpenAI API [docs]
For simplicity, agentmake uses ollama as the default backend if the backend parameter is not specified. Ollama models are downloaded automatically if they are not already available locally. Users can change the default backend by modifying the environment variable DEFAULT_AI_BACKEND.
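The backend-selection behavior described above can be sketched as follows. This is a minimal illustration of the documented precedence (explicit argument, then DEFAULT_AI_BACKEND, then ollama); the function name and signature are hypothetical, not agentmake's actual internals.

```python
import os

def resolve_backend(backend=None):
    """Pick the AI backend: an explicit argument wins, then the
    DEFAULT_AI_BACKEND environment variable, then the 'ollama' default."""
    return backend or os.environ.get("DEFAULT_AI_BACKEND", "ollama")
```

For example, `resolve_backend("openai")` always returns "openai", while `resolve_backend()` falls back to whatever DEFAULT_AI_BACKEND is set to, or "ollama" when it is unset.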
Setup Examples
https://github.com/eliranwong/agentmake/tree/main/docs
Introducing Agentic Components
agentmake is designed to work with seven kinds of components for building agentic applications:
- `system` - System messages are crucial for defining the roles of the AI agents and guiding how AI agents interact with users. Check out our examples. agentmake supports the use of fabric patterns as `system` components for running the agentmake function or CLI options. READ HERE.
- `instruction` - Predefined instructions that are added to users' prompts as prefixes before they are passed to the AI models. Check out our examples. agentmake supports the use of fabric patterns as `instruction` components for running the agentmake function or CLI options. READ HERE.
- `input_content_plugin` - Input content plugins process or transform user inputs before they are passed to the AI models. Check out our examples.
- `output_content_plugin` - Output content plugins process or transform assistant responses after they are generated by the AI models. Check out our examples.
- `tool` - Tools take simple structured actions in response to users' requests, with the use of schemas and function calling. Check out our examples.
- `agent` - Agents are agentic applications that automate multi-step actions or decisions to fulfill complicated requests. They can be executed on their own or integrated into an agentic workflow, supported by agentmake, to work collaboratively with other agents or components. Check out our examples.
- `follow_up_prompt` - Predefined prompts that are helpful for automating a series of follow-up responses after the first assistant response is generated. Check out our examples.
Built-in and Custom Agentic Components
agentmake supports both built-in agentic components, created by our developers or contributors, and custom agentic components, created by users to meet their own needs.
Built-in Agentic Components
Built-in agentic components are placed in the following six folders inside the agentmake package folder:
agents, instructions, plugins, prompts, systems, tools
To use the built-in components, you only need to specify the component filenames, without parent paths or file extensions, when you run the agentmake signature function or CLI options.
Custom Agentic Components
agentmake offers two options for users to use their custom components.
Option 1: Specify the full file path of individual components
Since each component can be organised as a single file, users only need to specify the file paths of the custom components they want to use when they run the agentmake signature function or CLI options.
Option 2: Place custom components into agentmake user directory
The default agentmake user directory is ~/agentmake, i.e. a folder named agentmake created under the user's home directory. Users may define their own path by modifying the environment variable AGENTMAKE_USER_DIR.
After creating a folder named agentmake under the user directory, create six sub-folders in it, named after the built-in folders above (agents, instructions, plugins, prompts, systems, tools), and place your custom components in the relevant folders, as we do with our built-in components.
If you organize the custom agentic components in this way, you only need to specify the component filenames, without parent paths or file extensions, when you run the agentmake signature function or CLI options.
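Setting up the user directory described above can be scripted. The sketch below is an illustration only, not part of agentmake itself; it honours AGENTMAKE_USER_DIR and creates the six sub-folders named in this README.

```python
import os

# The six sub-folder names used by agentmake, per the README.
SUBFOLDERS = ["agents", "instructions", "plugins", "prompts", "systems", "tools"]

def ensure_user_dir(base=None):
    """Create the agentmake user directory and its six component
    sub-folders, returning the base path."""
    if base is None:
        # Same precedence as agentmake: env var first, then ~/agentmake.
        base = os.environ.get("AGENTMAKE_USER_DIR",
                              os.path.expanduser("~/agentmake"))
    for name in SUBFOLDERS:
        os.makedirs(os.path.join(base, name), exist_ok=True)
    return base
```

Running `ensure_user_dir()` once leaves the directory tree ready for custom components to be dropped into the relevant sub-folders.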
Priorities
In cases where a built-in tool and a custom tool have the same name, the custom tool takes priority over the built-in one. This allows for flexibility, enabling users to copy a built-in tool, modify its content, and retain the same name, thereby effectively overriding the built-in tool.
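The shadowing rule above (a custom component overrides a built-in one with the same name) amounts to a simple lookup order. The sketch below illustrates that order; the function, its parameters, and the file extensions tried are hypothetical, while the directory layout follows this README.

```python
import os

def find_component(name, kind, user_dir, builtin_dir):
    """Resolve a component filename (no path, no extension) to a file,
    checking the user directory before the built-in one."""
    for root in (user_dir, builtin_dir):       # custom components win
        for ext in (".py", ".md", ".txt"):     # hypothetical extensions
            path = os.path.join(root, kind, name + ext)
            if os.path.isfile(path):
                return path
    return None                                # component not found
```

With this order, copying a built-in tool into the user directory's tools folder and editing it effectively replaces the original without touching the package.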
Agentic Applications Built on AgentMake AI
Below are a few examples to illustrate how easy it is to build agentic applications with AgentMake AI.
Example 1 - ToolMate AI
ToolMate AI version 2.0 is built entirely on AgentMake AI, based on the following two agentic workflows, to resolve both complex and simple tasks.
To resolve complex tasks: (see the workflow diagram in the repository)