# Mindwave

> [WIP] 🧠 Toolkit for building AI features into your Laravel app.

## Mindwave: Production AI Utilities for Laravel

The working developer's AI toolkit: long prompts, streaming, tracing, and context discovery made simple.

**v1.0.0 released:** all 4 pillars complete, with 1300+ tests.

> **Experimental:** this package is under active development. APIs may change. Use it in production at your own risk.
## What is Mindwave?

Mindwave is a Laravel package that provides production-grade AI utilities for building LLM-powered features. Unlike complex agent frameworks, Mindwave focuses on practical tools that Laravel developers actually need:

- ✅ **Auto-fit long prompts** to any model's context window
- ✅ **Stream LLM responses** with 3 lines of code (SSE/EventSource)
- ✅ **OpenTelemetry tracing** with database storage for costs, tokens, and performance
- ✅ **Ad-hoc context discovery** from your database/CSV using TNTSearch
## Why Mindwave?

Not another agent framework. Just batteries-included utilities for shipping AI features fast.

```php
// Write long prompts; Mindwave auto-fits them to model limits
Mindwave::prompt()
    ->section('system', $instructions)
    ->section('context', $largeDocument, priority: 50, shrinker: 'summarize')
    ->section('user', $question)
    ->fit() // Auto-trims to the context window
    ->run();

// Stream responses in 3 lines (backend)
return Mindwave::stream($prompt)->respond();

// View traces and costs
$traces = MindwaveTrace::expensive(0.10)->with('spans')->get();

// Pull context from your DB on the fly
Mindwave::prompt()
    ->context(TntSearchSource::fromEloquent(User::query(), fn ($u) => "Name: {$u->name}"))
    ->ask('Who has Laravel expertise?');
```
## Installation

Install via Composer:

```bash
composer require mindwave/mindwave
```

Publish the config files:

```bash
php artisan vendor:publish --tag="mindwave-config"
```

Run the migrations for tracing (optional but recommended):

```bash
php artisan migrate
```
## Quick Start

### 1. Basic LLM Chat

```php
use Mindwave\Mindwave\Facades\Mindwave;

$response = Mindwave::llm()->chat([
    ['role' => 'system', 'content' => 'You are a helpful assistant.'],
    ['role' => 'user', 'content' => 'Explain Laravel in one sentence.'],
]);

echo $response->content;
```
### 2. Streaming Responses

Backend:

```php
use Illuminate\Http\Request;
use Mindwave\Mindwave\Facades\Mindwave;

Route::get('/chat', function (Request $request) {
    return Mindwave::stream($request->input('message'))
        ->model('gpt-4')
        ->respond();
});
```

Frontend:

```javascript
const stream = new EventSource('/chat?message=' + encodeURIComponent(question));
stream.onmessage = e => output.textContent += e.data;
stream.addEventListener('done', () => stream.close());
```
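`EventSource` parses the SSE frames for you, but it only supports GET. If you need POST or custom headers you would read the stream with `fetch()` and split the frames yourself. A minimal, framework-free sketch of that parsing step (the helper `parseSseChunk` is our illustration, not part of Mindwave):

```javascript
// Split a buffered chunk of an SSE stream into complete events.
// Frames are separated by a blank line; each line is "field: value".
// Returns the parsed events plus any trailing incomplete frame,
// which the caller should prepend to the next chunk.
function parseSseChunk(buffer) {
  const frames = buffer.split('\n\n');
  const rest = frames.pop(); // last element may be an incomplete frame
  const events = [];
  for (const frame of frames) {
    let event = 'message';
    const data = [];
    for (const line of frame.split('\n')) {
      if (line.startsWith('event:')) event = line.slice(6).trim();
      else if (line.startsWith('data:')) data.push(line.slice(5).trimStart());
    }
    events.push({ event, data: data.join('\n') });
  }
  return { events, rest };
}

const { events, rest } = parseSseChunk('data: Hello\n\nevent: done\ndata: \n\ndata: partial');
// events → [{event: 'message', data: 'Hello'}, {event: 'done', data: ''}]
// rest → 'data: partial'
```

The `done` event in the frontend snippet above corresponds to a frame like `event: done`, which is why the client closes the connection when it arrives.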
### 3. Auto-Fit Long Prompts

```php
use Mindwave\Mindwave\Facades\Mindwave;

// Automatically handles token limits
Mindwave::prompt()
    ->reserveOutputTokens(500)
    ->section('system', 'You are an expert analyst', priority: 100)
    ->section('documentation', $longDocContent, priority: 50, shrinker: 'summarize')
    ->section('history', $conversationHistory, priority: 75)
    ->section('user', $userQuestion, priority: 100)
    ->fit() // Trims to the model's context window
    ->run();
```
### 4. View Costs & Traces

```php
use Mindwave\Mindwave\Observability\Models\Trace;
use Mindwave\Mindwave\Observability\Models\Span;

// Find expensive traces
$expensive = Trace::where('estimated_cost', '>', 0.10)
    ->with('spans')
    ->orderByDesc('created_at')
    ->get();

// Find slow LLM calls
$slow = Span::where('operation_name', 'chat')
    ->where('duration', '>', 5_000_000_000) // 5 seconds, in nanoseconds
    ->with('trace')
    ->get();

// Daily cost summary
$dailyCosts = Trace::selectRaw('
        DATE(created_at) as date,
        COUNT(*) as total_traces,
        SUM(estimated_cost) as total_cost,
        SUM(total_input_tokens) as input_tokens,
        SUM(total_output_tokens) as output_tokens
    ')
    ->groupBy('date')
    ->orderByDesc('date')
    ->get();
```
### 5. Ad-Hoc Context Discovery

```php
use Mindwave\Mindwave\Context\Sources\TntSearchSource;

// Search your database on the fly
Mindwave::prompt()
    ->context(
        TntSearchSource::fromEloquent(
            Product::where('active', true),
            fn ($p) => "Product: {$p->name}, Price: {$p->price}"
        )
    )
    ->ask('What products under $50 do you have?');

// Or from CSV files
Mindwave::prompt()
    ->context(TntSearchSource::fromCsv('data/knowledge-base.csv'))
    ->ask('How do I reset my password?');
```
## Core Features

### 🧩 Prompt Composer

Automatically manage context windows with priority-based section trimming:

- **Token budgeting** - Reserve tokens for output, auto-fit sections
- **Smart shrinkers** - Summarize, truncate, or compress content
- **Priority system** - Keep important sections, trim less critical ones
- **Multi-model support** - Works with GPT-4, Claude, Mistral, etc.
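Conceptually, fitting works by removing or shrinking the lowest-priority sections until the prompt fits the token budget. The sketch below illustrates the idea only; it is not Mindwave's actual implementation, which can also shrink sections (summarize/truncate) instead of dropping them outright:

```javascript
// Illustrative priority-based fitting: drop the lowest-priority
// section repeatedly until the total token count fits the budget.
function fitSections(sections, budget) {
  const kept = [...sections];
  const total = () => kept.reduce((sum, s) => sum + s.tokens, 0);
  while (total() > budget && kept.length > 0) {
    let lowest = 0;
    kept.forEach((s, i) => { if (s.priority < kept[lowest].priority) lowest = i; });
    kept.splice(lowest, 1); // least important section goes first
  }
  return kept.map(s => s.name);
}

fitSections([
  { name: 'system',  priority: 100, tokens: 200 },
  { name: 'context', priority: 50,  tokens: 4000 },
  { name: 'history', priority: 75,  tokens: 1000 },
  { name: 'user',    priority: 100, tokens: 100 },
], 2000); // → ['system', 'history', 'user']
```

This is why the examples give `system` and `user` the highest priority: bulky reference material gets trimmed first, and the instructions and question survive.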
### 🌊 Streaming (SSE)

Production-ready Server-Sent Events streaming:

- **3-line setup** - Backend and frontend
- **Proper headers** - Works with Nginx/Apache out of the box
- **Connection monitoring** - Handles client disconnects
- **Error handling** - Graceful failure and retry
### 📊 OpenTelemetry Tracing

Industry-standard observability with GenAI semantic conventions:

- **Automatic tracing** - All LLM calls tracked (zero configuration)
- **Database storage** - Query traces via Eloquent models
- **OTLP export** - Send to Jaeger, Grafana, Datadog, Honeycomb, etc.
- **Cost tracking** - Automatic cost estimation per call
- **Token usage** - Input/output/total tokens tracked
- **PII protection** - Configurable message capture and redaction
- **Artisan commands** - Export, prune, and analyze traces

Quick start:

```php
// 1. Enable tracing in .env
// MINDWAVE_TRACING_ENABLED=true

// 2. LLM calls are traced automatically
$response = Mindwave::llm()->generateText('Hello!');

// 3. Query the recorded spans
use Mindwave\Mindwave\Observability\Models\Span;

$expensive = Span::where('cost_usd', '>', 0.10)
    ->orderByDesc('cost_usd')
    ->get();
```
📖 Complete Tracing Guide - Querying, cost analysis, custom spans, OTLP setup
📐 Architecture Documentation - Technical deep dive
### 🔍 TNTSearch Context Discovery

Pull context from your application data without a complex RAG setup:

- **No infrastructure** - Pure PHP, no external services
- **Multiple sources** - Eloquent, arrays, CSV files, VectorStores
- **Fast indexing** - Ephemeral indexes with automatic cleanup
- **BM25 ranking** - Industry-standard relevance scoring
- **Auto-query extraction** - Search terms are extracted from user messages automatically
- **OpenTelemetry tracing** - Track search performance and results
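For intuition on what BM25 ranking means: a document scores higher the more often it contains a query term, weighted down for common terms and long documents. A toy scorer over pre-tokenized documents (purely illustrative; this is not TNTSearch's implementation):

```javascript
// Toy BM25: score document docs[docIndex] against queryTerms.
// k1 controls term-frequency saturation, b controls length normalization.
function bm25Score(docs, docIndex, queryTerms, k1 = 1.2, b = 0.75) {
  const N = docs.length;
  const avgLen = docs.reduce((s, d) => s + d.length, 0) / N;
  const doc = docs[docIndex];
  let score = 0;
  for (const term of queryTerms) {
    const tf = doc.filter(t => t === term).length; // term frequency in this doc
    if (tf === 0) continue;
    const df = docs.filter(d => d.includes(term)).length; // docs containing term
    const idf = Math.log(1 + (N - df + 0.5) / (df + 0.5)); // rarer terms weigh more
    score += idf * (tf * (k1 + 1)) / (tf + k1 * (1 - b + b * doc.length / avgLen));
  }
  return score;
}

const docs = [
  ['laravel', 'php', 'framework'],
  ['react', 'javascript', 'library'],
  ['laravel', 'laravel', 'expert'],
];
// Documents mentioning "laravel" more often rank higher:
bm25Score(docs, 2, ['laravel']) > bm25Score(docs, 0, ['laravel']); // true
```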
Quick example:

```php
use Mindwave\Mindwave\Context\Sources\TntSearch\TntSearchSource;
use Mindwave\Mindwave\Context\ContextPipeline;

// Search Eloquent models
$userSource = TntSearchSource::fromEloquent(
    User::where('active', true),
    fn ($u) => "Name: {$u->name}, Skills: {$u->skills}"
);

// Search CSV files
$docsSource = TntSearchSource::fromCsv('data/knowledge-base.csv');

// Combine multiple sources
$pipeline = (new ContextPipeline)
    ->addSource($userSource)
    ->addSource($docsSource)
    ->deduplicate() // Remove duplicates
    ->rerank();     // Sort by relevance

// Use in a prompt (the query is auto-extracted from the user message)
Mindwave::prompt()
    ->context($pipeline, limit: 5)
    ->section('user', 'Who has Laravel expertise?')
    ->run();
```
📖 Complete Context Discovery Guide - All source types, pipelines, advanced features
## Configuration

### LLM Configuration

```php
// config/mindwave-llm.php
return [
    'default' => env('MINDWAVE_LLM_DRIVER', 'openai'),
    'llms' => [
        'openai' => [
            'api_key' => env('OPENAI_API_KEY'),
            'model' => env('OPENAI_MODEL', 'gpt-4-turbo'),
            'max_tokens' => 4096,
            'temperature' => 0.7,
        ],
        'mistral' => [
            'api_key' => env('MISTRAL_API_KEY'),
            'model' => env('MISTRAL_MODEL', 'mistral-large-latest'),
        ],
    ],
];
```
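Everything in the config above is read from the environment, so the matching `.env` entries look like this (values are placeholders):

```dotenv
MINDWAVE_LLM_DRIVER=openai
OPENAI_API_KEY=your-openai-key
OPENAI_MODEL=gpt-4-turbo

# Only needed when using the mistral driver:
MISTRAL_API_KEY=your-mistral-key
MISTRAL_MODEL=mistral-large-latest
```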
### Tracing Configuration

```php
// config/mindwave-tracing.php
return [
    'enabled' => env('MINDWAVE_TRACING_ENABLED', true),
    'database' => [
        'enabled' => true, // Store traces in the database
    ],
    'otlp' => [
        'enabled' => env('MINDWAVE_TRACE_OTLP_ENABLED', false),
        'endpoint' => env('OTEL_EXPORTER_OTLP_ENDPOINT', 'http://localhost:4318'),
    ],
    'capture_messages' => false, // PII protection
    'retention_days' => 30,
];
```
## Artisan Commands

```bash
# Export traces to CSV/JSON
php artisan mindwave:export-traces --since=yesterday --format=csv

# Prune old traces
php artisan mindwave:prune-traces --older-than=30days

# View trace statistics
php artisan mindwave:trace-stats

# View TNTSearch index statistics
php artisan mindwave:index-stats

# Clear old TNTSearch indexes (default: older than 24 hours)
php artisan mindwave:clear-indexes
```
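To run pruning and index cleanup automatically instead of by hand, the commands can be registered with Laravel's scheduler like any other Artisan command. A sketch, assuming Laravel 11+ where scheduling lives in `routes/console.php` (older versions register schedules in the console kernel instead):

```php
use Illuminate\Support\Facades\Schedule;

// Prune old traces and clear stale TNTSearch indexes nightly.
Schedule::command('mindwave:prune-traces --older-than=30days')->daily();
Schedule::command('mindwave:clear-indexes')->daily();
```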
