# Supercompat
Use any AI provider with the OpenAI Assistants API
Supercompat is a universal adapter that lets you use OpenAI's Assistants API with any AI provider (Anthropic, Groq, Mistral, Azure, Google, and more). It provides a consistent interface for building AI assistants while allowing you to switch providers seamlessly.
## Features
- 🔄 Universal AI Provider Support - Works with OpenAI, Anthropic, Groq, Mistral, Azure, Google, OpenRouter, Perplexity, Together AI, Ollama, and more
- 📦 Flexible Storage - Use Prisma with your own database, OpenAI's Responses API, or Azure AI Agents
- 🔌 Plug-and-Play Architecture - Mix and match client adapters, storage adapters, and run adapters
- 🌊 Streaming Support - Real-time streaming responses for all providers
- 🛠️ Tool Calling - Function calling and code interpreter support across providers
- 📊 Run Steps - Detailed execution steps for debugging and monitoring
- 🔐 Type-Safe - Full TypeScript support with OpenAI's types
## Installation

```bash
npm install supercompat openai
```

Depending on which providers you want to use, install the corresponding SDK:

```bash
# For OpenAI and Azure OpenAI: no extra install needed
# (both use the 'openai' package installed above)

# For Anthropic
npm install @anthropic-ai/sdk

# For Groq
npm install groq-sdk

# For Mistral
npm install @mistralai/mistralai

# For Azure AI Agents
npm install @azure/ai-projects @azure/identity

# For Google Gemini
npm install @google/genai

# For OpenRouter (access 200+ models via one API)
npm install @openrouter/sdk

# For Perplexity, Together AI, Ollama, etc.
# (these use OpenAI-compatible APIs, so no additional SDK is needed)

# For Prisma storage
npm install @prisma/client
```
## Quick Start

### Basic Setup with Groq and Prisma

```typescript
import { supercompat, groqClientAdapter, prismaStorageAdapter, completionsRunAdapter } from 'supercompat'
import { PrismaClient } from '@prisma/client'
import Groq from 'groq-sdk'

const prisma = new PrismaClient()
const groq = new Groq({ apiKey: process.env.GROQ_API_KEY })

const client = supercompat({
  client: groqClientAdapter({ groq }),
  storage: prismaStorageAdapter({ prisma }),
  runAdapter: completionsRunAdapter(),
})

// Use it like OpenAI's Assistants API
const thread = await client.beta.threads.create()

const message = await client.beta.threads.messages.create(thread.id, {
  role: 'user',
  content: 'What is the capital of France?',
})

const run = await client.beta.threads.runs.createAndPoll(thread.id, {
  assistant_id: 'your-assistant-id',
})
```
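Once `createAndPoll` returns, the assistant's reply is read back with `client.beta.threads.messages.list(thread.id)`. A minimal sketch of pulling the newest assistant text out of that list; the message shape follows OpenAI's Assistants format, and `latestAssistantText` is our own helper, not part of Supercompat:

```typescript
// Minimal shape of an Assistants API message, as returned by messages.list()
type AssistantMessage = {
  role: 'user' | 'assistant'
  content: { type: string; text?: { value: string } }[]
}

// Return the text of the first assistant message in the list
// (messages.list() returns newest-first by default).
function latestAssistantText(messages: AssistantMessage[]): string | undefined {
  const reply = messages.find((m) => m.role === 'assistant')
  return reply?.content.find((c) => c.type === 'text')?.text?.value
}

// Example with a mocked list; in real code you would pass
// (await client.beta.threads.messages.list(thread.id)).data
const mock: AssistantMessage[] = [
  { role: 'assistant', content: [{ type: 'text', text: { value: 'Paris' } }] },
  { role: 'user', content: [{ type: 'text', text: { value: 'What is the capital of France?' } }] },
]
```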
## Architecture

Supercompat uses a modular architecture with three types of adapters that plug into the core:

```
┌──────────────────────────┐
│      Client Adapter      │────┐
│  • Anthropic             │    │
│  • Groq                  │    │
│  • OpenAI                │    │
│  • OpenRouter            │    │
│  • Mistral, etc.         │    │
└──────────────────────────┘    │
                                │
┌──────────────────────────┐    │    ┌─────────────────┐     ┌──────────────────────┐
│      Storage Adapter     │────┼───▶│                 │     │  OpenAI Assistants   │
│  • Prisma (Database)     │    │    │   Supercompat   │────▶│   API-Compatible     │
│  • Responses API         │    │    │                 │     │      Interface       │
│  • Azure AI Agents       │    │    └─────────────────┘     └──────────────────────┘
└──────────────────────────┘    │
                                │
┌──────────────────────────┐    │
│       Run Adapter        │────┘
│  • completions           │
│  • responses             │
│  • azureAgents           │
└──────────────────────────┘
```
How it works:

- **Client Adapters** interface with any AI provider (Anthropic, Groq, OpenAI, Mistral, etc.)
- **Storage Adapters** persist data using your preferred backend (Prisma/database, OpenAI Responses API, Azure AI Agents)
- **Run Adapters** execute runs using different strategies (completions, responses, azureAgents)

You plug all three adapter types into Supercompat, and it exposes an OpenAI Assistants API-compatible interface.
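The plug-and-play idea can be pictured as three small interfaces composed by one factory function. The types below are a hypothetical illustration of the pattern, not Supercompat's actual exported types:

```typescript
// Hypothetical adapter shapes, for illustration only;
// Supercompat's real types live in the package itself.
type ClientAdapter = { complete: (prompt: string) => Promise<string> }
type StorageAdapter = {
  save: (key: string, value: string) => void
  load: (key: string) => string | undefined
}
type RunAdapter = { execute: (client: ClientAdapter, input: string) => Promise<string> }

// The core wires the three together behind one interface.
function makeAssistant(client: ClientAdapter, storage: StorageAdapter, run: RunAdapter) {
  return {
    async ask(threadId: string, input: string): Promise<string> {
      const reply = await run.execute(client, input)
      storage.save(threadId, reply)
      return reply
    },
  }
}
```

Swapping providers then amounts to passing a different client adapter while storage and run logic stay unchanged.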
Note: in the future, Supercompat will also support translating to other API formats beyond the Assistants API (e.g., the Responses API).
## Client Adapters

Client adapters interface with AI provider APIs. Each adapter translates requests to the provider's format.

### Available Client Adapters

#### OpenAI

```typescript
import { supercompat, openaiClientAdapter, prismaStorageAdapter, completionsRunAdapter } from 'supercompat'
import { PrismaClient } from '@prisma/client'
import OpenAI from 'openai'

const prisma = new PrismaClient()
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY })

const client = supercompat({
  client: openaiClientAdapter({ openai }),
  storage: prismaStorageAdapter({ prisma }),
  runAdapter: completionsRunAdapter(),
})
```
#### Anthropic (Claude)

```typescript
import { supercompat, anthropicClientAdapter, prismaStorageAdapter, completionsRunAdapter } from 'supercompat'
import { PrismaClient } from '@prisma/client'
import Anthropic from '@anthropic-ai/sdk'

const prisma = new PrismaClient()
const anthropic = new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY })

const client = supercompat({
  client: anthropicClientAdapter({ anthropic }),
  storage: prismaStorageAdapter({ prisma }),
  runAdapter: completionsRunAdapter(),
})
```

Supports native Anthropic tool calling, including:

- Web search (`web_search_20241111`)
- Code execution (`code_execution_20241022`)
- Computer use (`computer_20241022`)
#### Groq

```typescript
import { supercompat, groqClientAdapter, prismaStorageAdapter, completionsRunAdapter } from 'supercompat'
import { PrismaClient } from '@prisma/client'
import Groq from 'groq-sdk'

const prisma = new PrismaClient()
const groq = new Groq({ apiKey: process.env.GROQ_API_KEY })

const client = supercompat({
  client: groqClientAdapter({ groq }),
  storage: prismaStorageAdapter({ prisma }),
  runAdapter: completionsRunAdapter(),
})
```
#### Mistral

```typescript
import { supercompat, mistralClientAdapter, prismaStorageAdapter, completionsRunAdapter } from 'supercompat'
import { PrismaClient } from '@prisma/client'
import { Mistral } from '@mistralai/mistralai'

const prisma = new PrismaClient()
const mistral = new Mistral({ apiKey: process.env.MISTRAL_API_KEY })

const client = supercompat({
  client: mistralClientAdapter({ mistral }),
  storage: prismaStorageAdapter({ prisma }),
  runAdapter: completionsRunAdapter(),
})
```
#### Azure OpenAI

```typescript
import { supercompat, azureOpenaiClientAdapter, prismaStorageAdapter, completionsRunAdapter } from 'supercompat'
import { PrismaClient } from '@prisma/client'
import { AzureOpenAI } from 'openai'

const prisma = new PrismaClient()
const azureOpenai = new AzureOpenAI({
  apiKey: process.env.AZURE_OPENAI_API_KEY,
  endpoint: process.env.AZURE_OPENAI_ENDPOINT,
  apiVersion: '2024-02-15-preview',
})

const client = supercompat({
  client: azureOpenaiClientAdapter({ azureOpenai }),
  storage: prismaStorageAdapter({ prisma }),
  runAdapter: completionsRunAdapter(),
})
```
#### Azure AI Agents

Use Azure AI Foundry's native Agents API:

```typescript
import { azureAiProjectClientAdapter, azureAgentsStorageAdapter, azureAgentsRunAdapter, supercompat } from 'supercompat'
import { AIProjectClient } from '@azure/ai-projects'
import { ClientSecretCredential } from '@azure/identity'
import { PrismaClient } from '@prisma/client'

const credential = new ClientSecretCredential(
  process.env.AZURE_TENANT_ID!,
  process.env.AZURE_CLIENT_ID!,
  process.env.AZURE_CLIENT_SECRET!,
)

const azureAiProject = new AIProjectClient(
  process.env.AZURE_PROJECT_ENDPOINT!,
  credential,
)

const prisma = new PrismaClient()
const runAdapter = azureAgentsRunAdapter({ azureAiProject })

const client = supercompat({
  client: azureAiProjectClientAdapter({ azureAiProject }),
  storage: azureAgentsStorageAdapter({ azureAiProject, prisma }),
  runAdapter,
})
```
Azure Setup:

To use Azure AI Agents, you need to:

1. Create an Azure AI Foundry Project in the Azure Portal.

2. Create a Service Principal (App Registration):

   ```bash
   az ad sp create-for-rbac --name "supercompat-app" --role Contributor \
     --scopes /subscriptions/{subscription-id}/resourceGroups/{resource-group}/providers/Microsoft.CognitiveServices/accounts/{ai-project}
   ```

3. Assign the "Cognitive Services User" role to the service principal:
   - Go to your AI Project in the Azure Portal
   - Navigate to "Access control (IAM)"
   - Click "Add role assignment"
   - Select the "Cognitive Services User" role
   - Select your service principal
   - Save

4. Set environment variables:

   ```bash
   AZURE_PROJECT_ENDPOINT=https://your-project.cognitiveservices.azure.com/
   AZURE_TENANT_ID=your-tenant-id
   AZURE_CLIENT_ID=your-client-id
   AZURE_CLIENT_SECRET=your-client-secret
   ```
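Since all four variables are required, it can help to fail fast at startup when one is missing. A small sketch (the `missingEnv` helper is our own convenience, not part of Supercompat):

```typescript
// Return the names of required variables that are missing or empty.
// In real code, call it as missingEnv(AZURE_VARS, process.env).
function missingEnv(required: string[], env: Record<string, string | undefined>): string[] {
  return required.filter((name) => !env[name])
}

const AZURE_VARS = [
  'AZURE_PROJECT_ENDPOINT',
  'AZURE_TENANT_ID',
  'AZURE_CLIENT_ID',
  'AZURE_CLIENT_SECRET',
]

// Example with a partial environment: three of the four are missing.
const missing = missingEnv(AZURE_VARS, { AZURE_TENANT_ID: 'tid' })
// if (missing.length) throw new Error(`Missing env vars: ${missing.join(', ')}`)
```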
#### Google Gemini

```typescript
import { supercompat, googleClientAdapter, prismaStorageAdapter, completionsRunAdapter } from 'supercompat'
import { PrismaClient } from '@prisma/client'
import { GoogleGenAI } from '@google/genai'

const prisma = new PrismaClient()
const google = new GoogleGenAI({ apiKey: process.env.GOOGLE_API_KEY })

const client = supercompat({
  client: googleClientAdapter({ google }),
  storage: prismaStorageAdapter({ prisma }),
  runAdapter: completionsRunAdapter(),
})
```
Supports computer use (computer_use_preview) via the native Gemini SDK with automatic coordinate denormalization.
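Coordinate denormalization maps the model's normalized coordinates back onto the real screen resolution. As a sketch, assuming the 0-1000 normalized grid used by computer-use style tooling (Supercompat's adapter handles this for you):

```typescript
// Map a coordinate from a normalized grid (default 0-1000)
// onto the actual screen dimension in pixels.
function denormalize(coord: number, screenSize: number, grid = 1000): number {
  return Math.round((coord / grid) * screenSize)
}

// e.g. a click at normalized (500, 250) on a 1920x1080 screen
const x = denormalize(500, 1920)
const y = denormalize(250, 1080)
```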
#### OpenRouter

Access 200+ models (Gemini, DeepSeek, Qwen, Grok, MiniMax, Kimi, GLM, and more) through a single API:
import { supercompat, openRouterClientAdapter, prismaStorageAdapter, completionsRunAdapter } from 'supercompat'
import { PrismaClient } from '@prisma/client'
import { OpenRouter } from '@openrouter/sdk'
const prisma = new PrismaClient()
const openRouter = new OpenRouter({
apiKey: process.env.OPENRO
