Forge
Forge is a self-hosted middleware that unifies access to multiple AI model providers (like OpenAI, Anthropic) via a single API. It supports OpenAI-compatible interfaces, encrypted API key management, and easy integration into existing tools and frontends. Built for security, scalability, and extensibility.
<div align="center"> <h3>One API for all AI models</h3> <p> <a href="https://www.producthunt.com/products/tensorblock-forge?embed=true&utm_source=badge-top-post-badge&utm_medium=badge&utm_source=badge-tensorblock-forge" target="_blank"><img src="https://api.producthunt.com/widgets/embed-image/v1/top-post-badge.svg?post_id=985244&theme=light&period=daily&t=1752002891028" alt="TensorBlock Forge - One API for all AI models | Product Hunt" style="width: 250px; height: 54px;" width="250" height="54" /></a> </p> <p> <a href="#key-features">Features</a> • <a href="#installation">Installation</a> • <a href="#usage">Usage</a> • <a href="#configuration">Configuration</a> • <a href="#security-and-privacy">Security</a> • <a href="#contributing">Contributing</a> • <a href="#license">License</a> </p> </div>

Introduction
Forge is an open-source middleware service that simplifies AI model provider management. It allows you to use multiple AI providers (OpenAI, Anthropic, etc.) through a single, unified API. By storing your provider API keys securely, Forge generates a unified key that works across all your AI applications.

News
- We have deployed Forge as an online service; feel free to ✨ explore the service and 🚀 try it now
- Forge now enables Claude Code to work with any LLM
Why Forge?
- Unified Experience: Use multiple AI models through a single, consistent API
- Simplified Key Management: Store provider keys once, use everywhere
- Seamless Integration: Compatible with OpenAI API standard for easy integration with existing tools
- Enhanced Security: Keys are encrypted and never exposed to end applications
Developer Quickstart
Make your first API request in minutes. Learn the basics of the Forge platform.
cURL Example
curl https://api.forge.tensorblock.co/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $FORGE_API_KEY" \
  -d '{
    "model": "OpenAI/gpt-4o",
    "messages": [
      {
        "role": "developer",
        "content": "You are a helpful assistant."
      },
      {
        "role": "user",
        "content": "Hello!"
      }
    ]
  }'
Python Example
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.forge.tensorblock.co/v1",
    api_key=os.environ["FORGE_API_KEY"],
)

# models = client.models.list()
completion = client.chat.completions.create(
    model="OpenAI/gpt-4o",
    messages=[
        {"role": "developer", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
)
print(completion.choices[0].message)
JavaScript/Node.js Example
const { OpenAI } = require("openai");

const client = new OpenAI({
  baseURL: "https://api.forge.tensorblock.co/v1",
  apiKey: process.env.FORGE_API_KEY,
});

async function main() {
  // const models = await client.models.list();
  const completion = await client.chat.completions.create({
    model: "OpenAI/gpt-4o",
    messages: [
      { role: "developer", content: "You are a helpful assistant." },
      { role: "user", content: "Hello!" },
    ],
  });
  console.log(completion.choices[0].message);
}

main();
Key Features
- Unified API Key: Store multiple provider API keys and access all with a single Forge API key
- OpenAI API Compatible: Drop-in replacement for any application that uses OpenAI's API
- Model Mapping: Create custom model names mapped to provider-specific models
- Advanced Security: Strong encryption for API keys with JWT-based authentication
- High Performance: Built for scalability and concurrent usage
- CLI Management: Easy key and user management via included command-line interface
- Extensible Architecture: Provider adapter pattern makes it easy to add new AI providers
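The provider adapter pattern named in the last bullet can be sketched roughly as follows. All class, function, and registry names here are illustrative assumptions, not Forge's actual internals; the only detail taken from the document is the "Provider/model" naming used in the quickstart (e.g. "OpenAI/gpt-4o"):

```python
# Illustrative provider-adapter registry. Not Forge's real code; it only
# demonstrates why adding a provider is a matter of registering one class.
from abc import ABC, abstractmethod

class ProviderAdapter(ABC):
    """Every provider exposes the same chat() surface to the router."""
    @abstractmethod
    def chat(self, model: str, messages: list) -> str: ...

ADAPTERS: dict = {}

def register(prefix: str):
    """Class decorator that files an adapter under its provider prefix."""
    def wrap(cls):
        ADAPTERS[prefix] = cls()
        return cls
    return wrap

@register("OpenAI")
class OpenAIAdapter(ProviderAdapter):
    def chat(self, model, messages):
        # A real adapter would call the upstream API here
        return f"(would call OpenAI with model={model})"

def route(full_model: str, messages: list) -> str:
    # Split "OpenAI/gpt-4o" into the registry key and the upstream model name
    prefix, _, model = full_model.partition("/")
    return ADAPTERS[prefix].chat(model, messages)
```

Under this shape, supporting a new provider amounts to registering one more adapter class.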
Installation
Prerequisites
- Python 3.12 or newer
- UV package manager (installed automatically by the setup scripts)
The project uses UV as the preferred package manager for faster and more reliable dependency installation.
Quick Start
# Clone the repository
git clone https://github.com/TensorBlock/forge.git
cd forge
# Make sure you have a running PostgreSQL instance.
# You can start one locally using Docker:
# docker compose up -d db
#
# Your DATABASE_URL for the local Docker setup will be:
# DATABASE_URL="postgresql://forge:forge@localhost:5432/forge"
# Create and configure your environment file
cp .env.example .env
# Now, edit the .env file with your specific settings.
# If you used the docker compose command above, you can use the DATABASE_URL provided.
# The `DATABASE_URL` should point to your running PostgreSQL instance.
# Example: DATABASE_URL="postgresql://user:password@localhost/mydatabase"
# Run the setup script
# For Linux/macOS:
chmod +x setup.sh
./setup.sh
# For Windows:
setup.bat
# Start the server
python run.py
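Before running the setup script, the DATABASE_URL you put in .env can be sanity-checked with a short stdlib-only snippet. This helper is illustrative and not part of Forge:

```python
# Parse a DATABASE_URL of the form postgresql://user:password@host:port/dbname
# (hypothetical helper, not shipped with Forge).
from urllib.parse import urlsplit

def check_database_url(url: str) -> dict:
    parts = urlsplit(url)
    if not parts.scheme.startswith("postgresql"):
        raise ValueError("expected a postgresql:// URL")
    return {
        "user": parts.username,
        "host": parts.hostname,
        "port": parts.port or 5432,  # PostgreSQL's default port
        "database": parts.path.lstrip("/"),
    }

print(check_database_url("postgresql://forge:forge@localhost:5432/forge"))
```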
Using Docker
# First, create your environment file from the example.
# The Docker container will automatically use this .env file for configuration.
cp .env.example .env
# Now, edit the .env file with your specific settings.
# Build and run with Docker Compose
docker compose up -d
Manual Setup
If you prefer to set up manually:
# After cloning the repo, create your environment file
cp .env.example .env
# Now, edit the .env file with your specific settings.
# Install UV package manager
curl -LsSf https://astral.sh/uv/install.sh | sh # Linux/macOS
# or
pip install uv # Windows
# Create a virtual environment
uv venv venv --python=python3.12
source venv/bin/activate # Linux/macOS
# or
venv\Scripts\activate # Windows
# Install dependencies
uv pip install -e .
uv pip install -e ".[dev]"  # dev extras, if defined in pyproject.toml
# Run migrations
alembic upgrade head
# Start the server
python run.py
See the detailed installation guide for step-by-step instructions.
Usage
Managing Forge with CLI
Forge comes with a command-line interface for easy management:
# Run in interactive mode
./forge-cli.py
# Or use specific commands
./forge-cli.py register --username myuser --email user@example.com
./forge-cli.py login --username myuser
./forge-cli.py add-key --provider openai --api-key sk-...
./forge-cli.py test --model gpt-4o --message "Hello, AI!"
See all available commands:
./forge-cli.py --help
API Documentation
API documentation is available at /docs when the server is running.
Integration
Connecting Frontends
To use Forge with frontends like CherryStudio, LobeChat, or any OpenAI-compatible application:
- Register and add your provider API keys in Forge
- Configure your frontend to use Forge's URL and your Forge API key
- Use standard model names (e.g., gpt-4o, claude-sonnet-4) or custom mappings
Frontend ➡️ Forge ➡️ AI Provider API
Supported Providers
Forge supports an extensive range of AI providers through both custom adapters and OpenAI-compatible interfaces:
- OpenAI
- Anthropic
- Google Gemini
- xAI
- DeepSeek
- Cohere
- Mistral
- Nvidia
- Alibaba
- Fireworks AI
- Azure OpenAI
- AWS Bedrock
- Together AI
- OpenRouter
- Cerebras
- Groq
- SambaNova
- Moonshot
- Hunyuan
- Baichuan
- Stepfun
- 01.ai
- Nebius
- Novita
- NScale
- DeepInfra
- Maritaca
- Featherless.ai
- Enfer
- Inference.net
- Kluster.ai
- Lambda
- Mancer
- Redpill.ai
- Parasail
- Nineteen.ai
- Targon
- Hyperbolic
- SiliconFlow
- TensorBlock
- Perplexity
- Zhipu
The Anthropic integration includes:
- Support for Claude 3 models with streaming responses
- Automatic model name mapping (e.g., "claude" → "claude-3-opus-20240229")
- Message API format support
- Token usage reporting
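Streaming responses arrive as OpenAI-style server-sent events, each `data:` line carrying an incremental delta. The following is a minimal sketch of accumulating such a stream; the helper is not part of Forge, and the sample payloads in it are illustrative:

```python
# Accumulate assistant text from OpenAI-style streaming "data:" lines.
# The wire shape (chunk -> choices[0] -> delta -> content) is the standard
# OpenAI chat-completions streaming schema that Forge proxies.
import json

def collect_stream_text(sse_lines) -> str:
    parts = []
    for line in sse_lines:
        if not line.startswith("data: "):
            continue                      # skip blank keep-alives / comments
        payload = line[len("data: "):].strip()
        if payload == "[DONE]":           # end-of-stream sentinel
            break
        delta = json.loads(payload)["choices"][0]["delta"]
        if delta.get("content"):
            parts.append(delta["content"])
    return "".join(parts)
```

With the official SDK you would instead pass stream=True to client.chat.completions.create and read chunk.choices[0].delta.content from each chunk.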
Configuration
Environment Variables
Key environment variables for your .env file (see .env.example for the complete list):
# Server settings
HOST=127.0.0.1
PORT=8000
# Database
DATABASE_URL=postgresql://user:password@localhost:5432/forge
# Security
# Key used to encrypt stored provider API keys
API_KEY_ENCRYPTION_KEY=your_generated_key
# Key used to sign JWTs
JWT_SECRET_KEY=your_generated_key
# Secret key for signing JWTs. Generate a strong, random key.
# Use `openssl rand -hex 32` to generate a key.
SECRET_KEY=
# Secret key for encrypting sensitive data like provider API keys.
# Generate with: python -c "from cryptography.fernet import Fernet; print(Fernet.generate_key().decode())"
ENCRYPTION_KEY=
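The two generation commands in the comments above can also be run as a single stdlib-only snippet; a Fernet key is simply 32 random bytes, URL-safe base64-encoded. The snippet is illustrative:

```python
# Generate values for SECRET_KEY (hex) and ENCRYPTION_KEY (Fernet-compatible)
# using only the standard library.
import base64
import secrets

secret_key = secrets.token_hex(32)  # 32 random bytes -> 64 hex chars

# A Fernet key is exactly 32 random bytes in URL-safe base64 (44 chars)
encryption_key = base64.urlsafe_b64encode(secrets.token_bytes(32)).decode()

print(f"SECRET_KEY={secret_key}")
print(f"ENCRYPTION_KEY={encryption_key}")
```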
Model Mapping
Create custom aliases for provider models:
{
"my-smart-model": "gpt-4o",
"my-fast-model": "claude-instant"
}
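Server-side, an alias from a mapping like the one above can resolve with a simple lookup that falls back to the requested name, so standard model ids keep working untouched. This is an illustrative sketch, not Forge's implementation:

```python
# Resolve a custom alias to its provider model, passing unknown names through.
# MODEL_MAPPING mirrors the example mapping shown above.
MODEL_MAPPING = {
    "my-smart-model": "gpt-4o",
    "my-fast-model": "claude-instant",
}

def resolve_model(requested: str) -> str:
    # Unknown names fall through unchanged, so standard model ids still work
    return MODEL_MAPPING.get(requested, requested)
```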
Security and Privacy
At Forge, we take the security and privacy of your API keys very seriously. Since our service requires storing provider API keys (like OpenAI, Anthropic, etc.), we've implemented multiple layers of protection:
