
Lynkr

Streamline your workflow with Lynkr, a CLI tool that acts as an HTTP proxy for efficient code interactions using Claude Code CLI.

Install / Use

/learn @Fast-Editor/Lynkr
About this skill

Quality Score

0/100

Supported Platforms

Claude Code
Claude Desktop
Cursor

README

Lynkr - Run Cursor, Cline, Continue, OpenAI-compatible tools, and Claude Code on any model.

One universal LLM proxy for AI coding tools.

Badges: npm version · Homebrew tap · License: Apache 2.0 · Ask DeepWiki · Databricks supported · AWS Bedrock · OpenAI-compatible · Ollama-compatible · llama.cpp-compatible

Use Case

        Cursor / Cline / Continue / Claude Code / Clawdbot / Codex / KiloCode
                        ↓
                       Lynkr
                        ↓
        Local LLMs | OpenRouter | Azure | Databricks | AWS Bedrock | Ollama | LMStudio | Gemini

Overview

Lynkr is a self-hosted proxy server that unlocks Claude Code CLI, Cursor IDE, and Codex CLI by enabling:

  • 🚀 Any LLM Provider - Databricks, AWS Bedrock (100+ models), OpenRouter (100+ models), Ollama (local), llama.cpp, Azure OpenAI, Azure Anthropic, OpenAI, LM Studio
  • 💰 60-80% Cost Reduction - Built-in token optimization with smart tool selection, prompt caching, and memory deduplication
  • 🔒 100% Local/Private - Run completely offline with Ollama or llama.cpp
  • 🌐 Remote or Local - Connect to providers on any IP/hostname (not limited to localhost)
  • 🎯 Zero Code Changes - Drop-in replacement for Anthropic's backend
  • 🏢 Enterprise-Ready - Circuit breakers, load shedding, Prometheus metrics, health checks

Perfect for:

  • Developers who want provider flexibility and cost control
  • Enterprises needing self-hosted AI with observability
  • Privacy-focused teams requiring local model execution
  • Teams seeking 60-80% cost reduction through optimization

Quick Start

Installation

Option 1: NPM Package (Recommended)

# Install globally
npm install -g pino-pretty 
npm install -g lynkr

lynkr start

Option 2: Git Clone

# Clone repository
git clone https://github.com/vishalveerareddy123/Lynkr.git
cd Lynkr

# Install dependencies
npm install

# Create .env from example
cp .env.example .env

# Edit .env with your provider credentials
nano .env

# Start server
npm start

Node.js Compatibility:

  • Node 20-24: Full support with all features
  • Node 25+: Full support (native modules auto-rebuild, babel fallback for code parsing)

Option 3: Docker

docker-compose up -d

Supported Providers

Lynkr supports 10+ LLM providers:

| Provider | Type | Models | Cost | Privacy |
|----------|------|--------|------|---------|
| AWS Bedrock | Cloud | 100+ (Claude, Titan, Llama, Mistral, etc.) | $$-$$$ | Cloud |
| Databricks | Cloud | Claude Sonnet 4.5, Opus 4.5 | $$$ | Cloud |
| OpenRouter | Cloud | 100+ (GPT, Claude, Llama, Gemini, etc.) | $-$$ | Cloud |
| Ollama | Local | Unlimited (free, offline) | FREE | 🔒 100% Local |
| llama.cpp | Local | GGUF models | FREE | 🔒 100% Local |
| Azure OpenAI | Cloud | GPT-4o, GPT-5, o1, o3 | $$$ | Cloud |
| Azure Anthropic | Cloud | Claude models | $$$ | Cloud |
| OpenAI | Cloud | GPT-4o, o1, o3 | $$$ | Cloud |
| LM Studio | Local | Local models with GUI | FREE | 🔒 100% Local |
| MLX OpenAI Server | Local | Apple Silicon (M1/M2/M3/M4) | FREE | 🔒 100% Local |

📖 Full Provider Configuration Guide


Claude Code Integration

Configure Claude Code CLI to use Lynkr:

# Set Lynkr as backend
export ANTHROPIC_BASE_URL=http://localhost:8081
export ANTHROPIC_API_KEY=dummy

# Run Claude Code
claude "Your prompt here"

That's it! Claude Code now uses your configured provider.

📖 Detailed Claude Code Setup
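Under the hood, Claude Code speaks the Anthropic Messages API to whatever ANTHROPIC_BASE_URL points at. Below is a minimal sketch of that request shape, usable as a smoke test against a running Lynkr instance. The /v1/messages path and x-api-key header follow the Anthropic API convention; the helper names and example prompt are illustrative, not part of Lynkr.

```python
# Illustrative smoke test for a local Lynkr instance. The port (8081) and
# the "dummy" key come from the setup above; everything else is a sketch.
import json
import urllib.request
import urllib.error

def build_messages_payload(prompt, model="claude-sonnet-4-5", max_tokens=64):
    """Build a minimal Anthropic Messages API request body."""
    return {
        "model": model,  # Lynkr maps this onto whichever provider you configured
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }

def send_to_lynkr(payload, base_url="http://localhost:8081"):
    """POST the payload to Lynkr; return the parsed reply, or None if unreachable."""
    req = urllib.request.Request(
        base_url + "/v1/messages",  # Anthropic-compatible endpoint
        data=json.dumps(payload).encode(),
        headers={"content-type": "application/json", "x-api-key": "dummy"},
        method="POST",
    )
    try:
        with urllib.request.urlopen(req, timeout=5) as resp:
            return json.load(resp)
    except (urllib.error.URLError, OSError):
        return None  # Lynkr not running on this machine

if __name__ == "__main__":
    reply = send_to_lynkr(build_messages_payload("Say hello in one word."))
    print(reply if reply else "Lynkr is not reachable on localhost:8081")
```

If the proxy is up, the reply is an Anthropic-format message object produced by whichever backend provider you configured.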


Cursor Integration

Configure Cursor IDE to use Lynkr:

  1. Open Cursor Settings

    • Mac: Cmd+, | Windows/Linux: Ctrl+,
    • Navigate to: Features → Models
  2. Configure OpenAI API Settings

    • API Key: sk-lynkr (any non-empty value)
    • Base URL: http://localhost:8081/v1
    • Model: claude-3.5-sonnet (or your provider's model)
  3. Test It

    • Chat: Cmd+L / Ctrl+L
    • Inline edits: Cmd+K / Ctrl+K
    • @Codebase search: Requires embeddings setup

📖 Full Cursor Setup Guide | Embeddings Configuration
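Unlike Claude Code, Cursor talks to the /v1 base URL in OpenAI's chat-completions wire format, which Lynkr translates to your provider. A tiny illustrative sketch of the request shape Cursor-style clients send (the helper name and prompt are made up; the model name and URL mirror the settings above):

```python
# Illustrative only: the OpenAI-style chat-completions body that reaches
# Lynkr's /v1 endpoint when Cursor makes a request.
def build_chat_payload(prompt, model="claude-3.5-sonnet"):
    """Build a minimal OpenAI-style chat-completions request body."""
    return {
        "model": model,  # any model your Lynkr provider supports
        "messages": [{"role": "user", "content": prompt}],
    }

# Endpoint derived from the Base URL configured in Cursor settings above.
LYNKR_CHAT_URL = "http://localhost:8081/v1/chat/completions"

print(LYNKR_CHAT_URL)
print(build_chat_payload("Explain this function."))
```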

Codex CLI Integration

Configure OpenAI Codex CLI to use Lynkr as its backend.

Option 1: Environment Variables (Quick Start)

export OPENAI_BASE_URL=http://localhost:8081/v1
export OPENAI_API_KEY=dummy

codex

Option 2: Config File (Recommended)

Edit ~/.codex/config.toml:

# Set Lynkr as the default provider
model_provider = "lynkr"
model = "gpt-4o"

# Define the Lynkr provider
[model_providers.lynkr]
name = "Lynkr Proxy"
base_url = "http://localhost:8081/v1"
wire_api = "responses"

# Optional: Trust your project directories
[projects."/path/to/your/project"]
trust_level = "trusted"

Configuration Options

| Option | Description | Example |
|--------|-------------|---------|
| model_provider | Active provider name | "lynkr" |
| model | Model to request (mapped by Lynkr) | "gpt-4o", "claude-sonnet-4-5" |
| base_url | Lynkr endpoint | "http://localhost:8081/v1" |
| wire_api | API format (responses or chat) | "responses" |
| trust_level | Project trust (trusted, sandboxed) | "trusted" |

Remote Lynkr Server

To connect Codex to a remote Lynkr instance:

[model_providers.lynkr-remote]
name = "Remote Lynkr"
base_url = "http://192.168.1.100:8081/v1"
wire_api = "responses"

Troubleshooting

| Issue | Solution |
|-------|----------|
| Same response for all queries | Disable semantic cache: SEMANTIC_CACHE_ENABLED=false |
| Tool calls not executing | Increase threshold: POLICY_TOOL_LOOP_THRESHOLD=15 |
| Slow first request | Keep Ollama loaded: OLLAMA_KEEP_ALIVE=24h |
| Connection refused | Ensure Lynkr is running: npm start |
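These toggles can also be set persistently in your .env instead of the shell. A sketch, with values taken from the troubleshooting table above (include only the lines you need; the Ollama setting only matters for local Ollama setups):

```shell
# .env fragment for the fixes above
SEMANTIC_CACHE_ENABLED=false      # avoid identical cached answers for distinct queries
POLICY_TOOL_LOOP_THRESHOLD=15     # allow longer tool-call loops before the policy cuts in
OLLAMA_KEEP_ALIVE=24h             # keep the local Ollama model resident in memory
```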

Note: Codex uses the OpenAI Responses API format. Lynkr automatically converts this to your configured provider's format.


ClawdBot Integration

Lynkr supports ClawdBot via its OpenAI-compatible API. ClawdBot users can route requests through Lynkr to access any supported provider.

Configuration in ClawdBot:

| Setting | Value |
|---------|-------|
| Model/auth provider | Copilot |
| Copilot auth method | Copilot Proxy (local) |
| Copilot Proxy base URL | http://localhost:8081/v1 |
| Model IDs | Any model your Lynkr provider supports |

Available models (depending on your Lynkr provider): gpt-5.2, gpt-5.1-codex, claude-opus-4.5, claude-sonnet-4.5, claude-haiku-4.5, gemini-3-pro, gemini-3-flash, and more.

🌐 Remote Support: ClawdBot can connect to Lynkr on any machine - use any IP/hostname in the Proxy base URL (e.g., http://192.168.1.100:8081/v1 or http://gpu-server:8081/v1).


Lynkr also supports Cline, Continue.dev, and other OpenAI-compatible tools.


Documentation

Getting Started

IDE & CLI Integration

Features & Capabilities

Deployment & Operations

Support


External Resources

View on GitHub
GitHub Stars: 394
Category: Development
Updated: 1d ago
Forks: 43

Languages

JavaScript

Security Score

100/100

Audited on Mar 27, 2026

No findings