MCPHost 🤖

A CLI host application that enables Large Language Models (LLMs) to interact with external tools through the Model Context Protocol (MCP). Currently supports Claude, OpenAI, Google Gemini, and Ollama models.

Discuss the Project on Discord

Overview 🌟

MCPHost acts as a host in the MCP client-server architecture, where:

  • Hosts (like MCPHost) are LLM applications that manage connections and interactions
  • Clients maintain 1:1 connections with MCP servers
  • Servers provide context, tools, and capabilities to the LLMs

This architecture allows language models to:

  • Access external tools and data sources 🛠️
  • Maintain consistent context across interactions 🔄
  • Execute commands and retrieve information safely 🔒

Currently supports:

  • Anthropic Claude models (Claude 3.5 Sonnet, Claude 3.5 Haiku, etc.)
  • OpenAI models (GPT-4, GPT-4 Turbo, GPT-3.5, etc.)
  • Google Gemini models (Gemini 2.0 Flash, Gemini 1.5 Pro, etc.)
  • Any Ollama-compatible model with function calling support
  • Any OpenAI-compatible API endpoint

Features ✨

  • Interactive conversations with multiple AI models
  • Non-interactive mode for scripting and automation
  • Script mode for executable YAML-based automation scripts
  • Support for multiple concurrent MCP servers
  • Tool filtering with allowedTools and excludedTools per server
  • Dynamic tool discovery and integration
  • Tool calling capabilities across all supported models
  • Configurable MCP server locations and arguments
  • Consistent command interface across model types
  • Configurable message history window for context management
  • OAuth authentication support for Anthropic (alternative to API keys)
  • Hooks system for custom integrations and security policies
  • Environment variable substitution in configs and scripts
  • Builtin servers for common functionality (filesystem, bash, todo, http)

Requirements 📋

  • Go 1.23 or later
  • For OpenAI/Anthropic: API key for the respective provider
  • For Ollama: Local Ollama installation with desired models
  • For Google/Gemini: Google API key (see https://aistudio.google.com/app/apikey)
  • One or more MCP-compatible tool servers

Environment Setup 🔧

  1. API Keys:
# For all providers (use --provider-api-key flag or these environment variables)
export OPENAI_API_KEY='your-openai-key'        # For OpenAI
export ANTHROPIC_API_KEY='your-anthropic-key'  # For Anthropic
export GOOGLE_API_KEY='your-google-key'        # For Google/Gemini
  2. Ollama Setup:
  • Install Ollama from https://ollama.ai
  • Pull your desired model:
ollama pull mistral
  • Ensure Ollama is running:
ollama serve

You can also configure the Ollama client using standard environment variables, such as OLLAMA_HOST for the Ollama base URL.
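For example, to point the client at an Ollama instance that is not on the default endpoint (the address below is illustrative):

```shell
# Ollama clients honor OLLAMA_HOST; the default is http://localhost:11434.
export OLLAMA_HOST=http://192.168.1.50:11434
```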

  3. Google API Key (for Gemini):
export GOOGLE_API_KEY='your-api-key'
  4. OpenAI-Compatible Setup:
  • Get your API server's base URL, API key, and model name
  • Use --provider-url and --provider-api-key flags or set environment variables
  5. Self-Signed Certificates (TLS): If your provider uses self-signed certificates (e.g., local Ollama with HTTPS), you can skip certificate verification:
mcphost --provider-url https://192.168.1.100:443 --tls-skip-verify

⚠️ WARNING: Only use --tls-skip-verify for development or when connecting to trusted servers with self-signed certificates. This disables TLS certificate verification and is insecure for production use.

Installation 📦

go install github.com/mark3labs/mcphost@latest

SDK Usage 🛠️

MCPHost also provides a Go SDK for programmatic access without spawning OS processes. The SDK maintains identical behavior to the CLI, including configuration loading, environment variables, and defaults.

Quick Example

package main

import (
    "context"
    "fmt"
    "github.com/mark3labs/mcphost/sdk"
)

func main() {
    ctx := context.Background()
    
    // Create MCPHost instance with default configuration
    host, err := sdk.New(ctx, nil)
    if err != nil {
        panic(err)
    }
    defer host.Close()
    
    // Send a prompt and get response
    response, err := host.Prompt(ctx, "What is 2+2?")
    if err != nil {
        panic(err)
    }
    
    fmt.Println(response)
}

SDK Features

  • ✅ Programmatic access without spawning processes
  • ✅ Identical configuration behavior to CLI
  • ✅ Session management (save/load/clear)
  • ✅ Tool execution callbacks for monitoring
  • ✅ Streaming support
  • ✅ Full compatibility with all providers and MCP servers

For detailed SDK documentation, examples, and API reference, see the SDK README.

Configuration ⚙️

MCP Servers

MCPHost automatically creates a configuration file in your home directory if one doesn't exist. It looks for config files in this order:

  • .mcphost.yml or .mcphost.json (preferred)
  • .mcp.yml or .mcp.json (backwards compatibility)

Config file locations by OS:

  • Linux/macOS: ~/.mcphost.yml, ~/.mcphost.json, ~/.mcp.yml, ~/.mcp.json
  • Windows: %USERPROFILE%\.mcphost.yml, %USERPROFILE%\.mcphost.json, %USERPROFILE%\.mcp.yml, %USERPROFILE%\.mcp.json

You can also specify a custom location using the --config flag.

Environment Variable Substitution

MCPHost supports environment variable substitution in both config files and script frontmatter using the syntax:

  • ${env://VAR} - Required environment variable (fails if not set)
  • ${env://VAR:-default} - Optional environment variable with default value

This allows you to keep sensitive information like API keys in environment variables while maintaining flexible configuration.

Example:

mcpServers:
  github:
    type: local
    command: ["docker", "run", "-i", "--rm", "-e", "GITHUB_PERSONAL_ACCESS_TOKEN=${env://GITHUB_TOKEN}", "ghcr.io/github/github-mcp-server"]
    environment:
      DEBUG: "${env://DEBUG:-false}"
      LOG_LEVEL: "${env://LOG_LEVEL:-info}"

model: "${env://MODEL:-anthropic/claude-sonnet-4-5-20250929}"
provider-api-key: "${env://OPENAI_API_KEY}"  # Required - will fail if not set

Usage:

# Set required environment variables
export GITHUB_TOKEN="ghp_your_token_here"
export OPENAI_API_KEY="your_openai_key"

# Optionally override defaults
export DEBUG="true"
export MODEL="openai/gpt-4"

# Run mcphost
mcphost

Simplified Configuration Schema

MCPHost now supports a simplified configuration schema with three server types:

Local Servers

For local MCP servers that run commands on your machine:

{
  "mcpServers": {
    "filesystem": {
      "type": "local",
      "command": ["npx", "@modelcontextprotocol/server-filesystem", "${env://WORK_DIR:-/tmp}"],
      "environment": {
        "DEBUG": "${env://DEBUG:-false}",
        "LOG_LEVEL": "${env://LOG_LEVEL:-info}",
        "API_TOKEN": "${env://FS_API_TOKEN}"
      },
      "allowedTools": ["read_file", "write_file"],
      "excludedTools": ["delete_file"]
    },
    "github": {
      "type": "local",
      "command": ["docker", "run", "-i", "--rm", "-e", "GITHUB_PERSONAL_ACCESS_TOKEN=${env://GITHUB_TOKEN}", "ghcr.io/github/github-mcp-server"],
      "environment": {
        "DEBUG": "${env://DEBUG:-false}"
      }
    },
    "sqlite": {
      "type": "local",
      "command": ["uvx", "mcp-server-sqlite", "--db-path", "${env://DB_PATH:-/tmp/foo.db}"],
      "environment": {
        "SQLITE_DEBUG": "${env://DEBUG:-0}",
        "DATABASE_URL": "${env://DATABASE_URL:-sqlite:///tmp/foo.db}"
      }
    }
  }
}

Each local server entry supports the following fields:

  • type: Must be set to "local"
  • command: Array containing the command and all its arguments
  • environment: (Optional) Object with environment variables as key-value pairs
  • allowedTools: (Optional) Array of tool names to include (whitelist)
  • excludedTools: (Optional) Array of tool names to exclude (blacklist)
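The filtering semantics can be sketched as: when allowedTools is non-empty, only listed tools survive; excludedTools then removes any remaining matches. The `filterTools` helper below is illustrative, not part of MCPHost:

```go
package main

import "fmt"

// filterTools applies the allowedTools whitelist (when non-empty)
// and then the excludedTools blacklist to a server's tool names.
func filterTools(tools, allowed, excluded []string) []string {
	allowedSet := toSet(allowed)
	excludedSet := toSet(excluded)
	var out []string
	for _, t := range tools {
		if len(allowedSet) > 0 && !allowedSet[t] {
			continue // not whitelisted
		}
		if excludedSet[t] {
			continue // blacklisted
		}
		out = append(out, t)
	}
	return out
}

func toSet(names []string) map[string]bool {
	s := make(map[string]bool, len(names))
	for _, n := range names {
		s[n] = true
	}
	return s
}

func main() {
	tools := []string{"read_file", "write_file", "delete_file"}
	// With the filesystem config above, only read_file and write_file remain.
	fmt.Println(filterTools(tools, []string{"read_file", "write_file"}, []string{"delete_file"}))
}
```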

Remote Servers

For remote MCP servers accessible via HTTP:

{
  "mcpServers": {
    "websearch": {
      "type": "remote",
      "url": "${env://WEBSEARCH_URL:-https://api.example.com/mcp}",
      "headers": ["Authorization: Bearer ${env://WEBSEARCH_TOKEN}"]
    },
    "weather": {
      "type": "remote", 
      "url": "${env://WEATHER_URL:-https://weather-mcp.example.com}"
    }
  }
}

Each remote server entry supports the following fields:

  • type: Must be set to "remote"
  • url: The URL where the MCP server is accessible
  • headers: (Optional) Array of HTTP headers for authentication and custom headers

Remote servers automatically use the StreamableHTTP transport for optimal performance.
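Each headers entry is a single "Name: Value" string. Parsing such entries into HTTP headers can be sketched as follows (illustrative, not MCPHost's code):

```go
package main

import (
	"fmt"
	"net/http"
	"strings"
)

// parseHeaders converts "Name: Value" strings, as used in the
// headers array above, into an http.Header.
func parseHeaders(raw []string) (http.Header, error) {
	h := http.Header{}
	for _, line := range raw {
		name, value, ok := strings.Cut(line, ":")
		if !ok {
			return nil, fmt.Errorf("malformed header %q (expected \"Name: Value\")", line)
		}
		h.Set(strings.TrimSpace(name), strings.TrimSpace(value))
	}
	return h, nil
}

func main() {
	h, err := parseHeaders([]string{"Authorization: Bearer abc123"})
	if err != nil {
		panic(err)
	}
	fmt.Println(h.Get("Authorization"))
}
```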
