# Engram
MCP server for AI memory -- hybrid search (BM25 + semantic + knowledge graph), temporal decay, local-first
## Why This Exists
You tell your AI something important. A name, an allergy, a deadline. Next conversation -- it's forgotten. You repeat yourself. You re-explain context. You carry the cognitive load that your AI should carry for you.
Engram gives your AI a real memory system. Tell it once:
"My colleague Sarah is allergic to shellfish and prefers window seats. She's leading the Q1 product launch."
Weeks later, ask:
"I'm booking a team lunch and flights for the offsite -- what should I know?"
Engram connects the dots. It remembers Sarah, the allergy, the seating preference, the workload. Your AI suggests restaurants without shellfish, books her a window seat, and flags that she's probably swamped with the launch.
This is not keyword matching. It is understanding.
> An *engram* is a unit of cognitive information imprinted in a physical substance -- the biological basis of memory.
## Install

```bash
npm install -g @199-bio/engram
```

Requires Node.js 18+.
## Quick Start

### With Claude Desktop (or any MCP desktop client)

Add to your MCP config (`~/Library/Application Support/Claude/claude_desktop_config.json`):
```json
{
  "mcpServers": {
    "engram": {
      "command": "npx",
      "args": ["-y", "@199-bio/engram"],
      "env": {
        "ANTHROPIC_API_KEY": "sk-ant-..."
      }
    }
  }
}
```
### With Claude Code

```bash
claude mcp add engram -- npx -y @199-bio/engram
```

That's it. Your AI now remembers.
## How It Works
Engram runs three search methods in parallel and fuses the results:
```
            ┌─────────────────┐
            │   Your Query    │
            └────────┬────────┘
                     │
     ┌───────────────┼───────────────┐
     │               │               │
     ▼               ▼               ▼
┌──────────┐   ┌──────────┐   ┌──────────────┐
│   BM25   │   │ Semantic │   │  Knowledge   │
│ Keyword  │   │Embedding │   │    Graph     │
│  Search  │   │  Search  │   │    Lookup    │
└────┬─────┘   └────┬─────┘   └──────┬───────┘
     │              │                │
     └──────────────┼────────────────┘
                    │
          ┌─────────▼─────────┐
          │  Reciprocal Rank  │
          │      Fusion       │
          └─────────┬─────────┘
                    │
          ┌─────────▼─────────┐
          │  Temporal Decay   │
          │ + Salience Score  │
          └─────────┬─────────┘
                    │
          ┌─────────▼─────────┐
          │  Ranked Results   │
          └───────────────────┘
```
BM25 finds exact keyword matches for names and phrases via SQLite FTS5.
Semantic search finds conceptually related content using Jina v5 embeddings with MLX Metal acceleration (~9ms/query on Apple Silicon).
Knowledge graph expands results through entity relationships -- ask about Sarah and her company, projects, and preferences all surface together.
Results are merged with Reciprocal Rank Fusion, then scored by temporal decay (Ebbinghaus forgetting curve) and salience. Fresh memories surface first. Important memories resist fading.
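The fusion step can be sketched in a few lines. This is an illustration of Reciprocal Rank Fusion in general, not Engram's actual implementation:

```typescript
// Reciprocal Rank Fusion: each result list votes 1 / (k + rank) for a
// memory ID; the constant k (commonly 60) damps the dominance of top ranks.
function rrf(lists: string[][], k = 60): Map<string, number> {
  const scores = new Map<string, number>();
  for (const list of lists) {
    list.forEach((id, rank) => {
      scores.set(id, (scores.get(id) ?? 0) + 1 / (k + rank + 1));
    });
  }
  return scores;
}
```

The practical effect: a memory that appears in two of the three result lists outranks one that appears, even highly, in only one.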
## Features

### Memory That Feels Real
Things fade. A memory from six months ago that you never revisited becomes harder to find. But important things -- a name, a birthday, a preference -- stay accessible even as time passes.
Recall strengthens. Every time a memory surfaces, it becomes more permanent. The things you think about often are the things your AI won't forget.
Everything connects. People link to places, places to events, events to details. The knowledge graph keeps your world coherent.
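A minimal sketch of how decay and recall-strengthening could interact; the field names and constants here are hypothetical, not Engram's schema:

```typescript
// Ebbinghaus-style retention: R = exp(-t / S), where the "stability" S
// grows each time the memory is recalled, so it fades more slowly.
interface Memory {
  stability: number;   // decay time constant, in days
  lastRecall: number;  // epoch milliseconds
}

const DAY_MS = 86_400_000;

function retention(m: Memory, now: number): number {
  const ageDays = (now - m.lastRecall) / DAY_MS;
  return Math.exp(-ageDays / m.stability);
}

// Each successful recall multiplies stability and resets the clock.
function reinforce(m: Memory, now: number): void {
  m.stability *= 1.5;
  m.lastRecall = now;
}
```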
### MCP Tools
Your AI gets these capabilities through the Model Context Protocol:
| Tool | What It Does |
|------|-------------|
| `remember` | Store new information with importance and emotional weight |
| `recall` | Find relevant memories ranked by relevance and recency |
| `forget` | Remove a specific memory |
| `create_entity` | Add a person, place, or concept to the knowledge graph |
| `observe` | Record a fact about an entity |
| `relate` | Connect two entities (e.g., "works at", "married to") |
| `query_entity` | Get everything known about someone or something |
| `list_entities` | See all tracked entities |
| `stats` | View memory statistics |
| `consolidate` | Compress old memories and detect contradictions |
| `engram_web` | Launch a visual memory browser |
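Under the hood, a client invokes these via a standard MCP `tools/call` request. The envelope below follows the MCP JSON-RPC spec, but the argument names (`content`, `importance`) are illustrative assumptions, not Engram's documented schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "remember",
    "arguments": {
      "content": "Sarah is allergic to shellfish",
      "importance": 0.9
    }
  }
}
```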
### Memory Consolidation
With an API key, Engram compresses old memories -- like sleep turning experiences into long-term storage:
- Groups related low-importance memories
- Creates AI-generated summaries (digests)
- Flags contradictory information
- Archives the originals
Storage stays lean, but nothing important gets lost.
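The selection step above can be sketched roughly as follows, assuming hypothetical `importance` and `ageDays` fields and illustrative thresholds:

```typescript
// Pick old, low-importance memories and group them by topic;
// each group would then become one AI-generated digest.
interface Mem { id: string; topic: string; importance: number; ageDays: number }

function pickConsolidationGroups(
  mems: Mem[],
  maxImportance = 0.3,
  minAgeDays = 90
): Map<string, Mem[]> {
  const groups = new Map<string, Mem[]>();
  for (const m of mems) {
    if (m.importance <= maxImportance && m.ageDays >= minAgeDays) {
      const g = groups.get(m.topic) ?? [];
      g.push(m);
      groups.set(m.topic, g);
    }
  }
  return groups;
}
```

High-importance memories never enter a group, which is why nothing important gets summarized away.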
## Privacy

Your memories stay on your machine. Everything lives in `~/.engram/`. The only external call is optional -- if you provide an API key, Engram can compress old memories into summaries. Core functionality works offline.
## Performance
On an M1 MacBook Air:
| Operation | Time |
|-----------|------|
| Remember | ~100ms |
| Recall | ~50ms |
| Graph queries | ~5ms |
| Consolidate | ~2-5s per batch |
Storage: ~1KB per memory.
## Configuration
Environment variables:
| Variable | Purpose | Default |
|----------|---------|---------|
| `ENGRAM_DB_PATH` | Where to store data | `~/.engram/` |
| `ANTHROPIC_API_KEY` | Enable memory consolidation | None (optional) |
| `COLBERT_MODEL` | ColBERT reranking model | `colbert-ir/colbertv2.0` |
| `EMBEDDING_MODEL` | Embedding model for semantic search | `Qwen/Qwen3-Embedding-0.6B` |
| `MAX_MEMORY_CACHE` | In-memory cache size | 1000 |
| `RETRIEVAL_TOP_K` | Initial retrieval pool size | 50 |
| `RERANK_TOP_K` | Final results after reranking | 10 |
| `ENGRAM_TRANSPORT` | Transport mode (`stdio` or `http`) | `stdio` |
| `PORT` | HTTP port for remote deployment | 3000 |
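For example, the variables above could configure a remote HTTP deployment; the port and data path here are illustrative:

```bash
# Run Engram as an HTTP server with a custom data directory
ENGRAM_TRANSPORT=http PORT=8080 ENGRAM_DB_PATH=/srv/engram npx -y @199-bio/engram
```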
## Building from Source

```bash
git clone https://github.com/199-biotechnologies/engram.git
cd engram
npm install
npm run build
npm install -g .
```
For semantic search with local embeddings:
```bash
pip install jina-grep
```
This uses Jina v5 embeddings with MLX Metal acceleration (~9ms/query). If unavailable, Engram falls back to keyword-only search.
## Roadmap
- [x] Hybrid search (BM25 + semantic embeddings)
- [x] Knowledge graph with entity relationships
- [x] Memory decay and strengthening (Ebbinghaus curve)
- [x] Consolidation with contradiction detection
- [x] Web interface for visual memory browsing
- [ ] Export and import
- [ ] Scheduled consolidation
## Contributing
Contributions are welcome. See CONTRIBUTING.md for guidelines.
## License
MIT -- Copyright (c) 2025 Boris Djordjevic, 199 Biotechnologies
<p align="center"> Built by <a href="https://github.com/longevityboris">Boris Djordjevic</a> at <a href="https://github.com/199-biotechnologies">199 Biotechnologies</a> | <a href="https://paperfoot.ai">Paperfoot AI</a> </p> <p align="center"> <a href="https://github.com/199-biotechnologies/engram/stargazers"> <img src="https://img.shields.io/github/stars/199-biotechnologies/engram?style=for-the-badge&logo=github&label=%E2%AD%90%20Star%20this%20repo&color=yellow" alt="GitHub Stars" /> </a> <a href="https://x.com/longevityboris"> <img src="https://img.shields.io/badge/Follow_%40longevityboris-000000?style=for-the-badge&logo=x&logoColor=white" alt="Follow on X" /> </a> </p>
