
CortexGraph

Temporal memory system for AI assistants with human-like forgetting curves. All data stored locally in human-readable formats: JSONL for short-term memory, Markdown (Obsidian-compatible) for long-term. Memories naturally decay unless reinforced. Features knowledge graphs, smart prompting, and MCP server integration for Claude.

Install / Use

/learn @prefrontal-systems/Cortexgraph

README

CortexGraph: Temporal Memory for AI

<!-- mcp-name: io.github.prefrontal-systems/cortexgraph -->

A Model Context Protocol (MCP) server providing human-like memory dynamics for AI assistants. Memories naturally fade over time unless reinforced through use, mimicking the Ebbinghaus forgetting curve.

License: AGPL-3.0 Python 3.10+ Tests Security Scanning codecov SBOM: CycloneDX

[!NOTE] About the Name & Version

This project was originally developed as mnemex (published to PyPI up to v0.6.0). In November 2025, it was transferred to Prefrontal Systems and renamed to CortexGraph to better reflect its role within a broader cognitive architecture for AI systems.

Version numbering starts at 0.1.0 for the cortexgraph package to signal a fresh start under the new name, while acknowledging the mature, well-tested codebase (791 tests, 98%+ coverage) inherited from mnemex. The mnemex package remains frozen at v0.6.0 on PyPI.

This versioning approach:

  • Signals "new package" to PyPI users discovering cortexgraph
  • Gives room to evolve the brand, API, and organizational integration before 1.0
  • Maintains continuity: users can migrate from pip install mnemex → pip install cortexgraph
  • Reflects that while the code is mature, the cortexgraph identity is just beginning

[!IMPORTANT] 🔬 RESEARCH ARTIFACT - NOT FOR PRODUCTION

This software is a Proof of Concept (PoC) and reference implementation for research purposes. It exists to validate theoretical frameworks in cognitive architecture and AI safety (specifically the STOPPER Protocol and CortexGraph).

It is NOT a commercial product. It is not maintained for general production use, may contain breaking changes, and offers no guarantees of stability or support. Use it to study the concepts, but build your own production implementations.

📖 New to this project? Start with the ELI5 Guide for a simple explanation of what this does and how to use it.

What is CortexGraph?

CortexGraph gives AI assistants like Claude a human-like memory system.

The Problem

When you chat with Claude, it forgets everything between conversations. You tell it "I prefer TypeScript" or "I'm allergic to peanuts," and three days later, you have to repeat yourself. This is frustrating and wastes time.

What CortexGraph Does

CortexGraph makes AI assistants remember things naturally, just like human memory:

  • 🧠 Remembers what matters - Your preferences, decisions, and important facts
  • Forgets naturally - Old, unused information fades away over time (like the Ebbinghaus forgetting curve)
  • 💪 Gets stronger with use - The more you reference something, the longer it's remembered
  • 📦 Saves important things permanently - Frequently used memories get promoted to long-term storage

How It Works (Simple Version)

  1. You talk naturally - "I prefer dark mode in all my apps"
  2. Memory is saved automatically - No special commands needed
  3. Time passes - Memory gradually fades if not used
  4. You reference it again - "Make this app dark mode"
  5. Memory gets stronger - Now it lasts even longer
  6. Important memories promoted - Used 5+ times? Saved permanently to your Obsidian vault

No flashcards. No explicit review. Just natural conversation.
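The six-step flow above can be sketched with the scoring formula from the Core Algorithm section below. This is a minimal illustration using the documented defaults (β = 0.6, 3-day half-life); the function and variable names are illustrative, not CortexGraph's actual API.

```python
import math

# Decay constant from the documented default 3-day half-life.
HALF_LIFE_DAYS = 3
LAMBDA = math.log(2) / (HALF_LIFE_DAYS * 86400)  # per second
BETA = 0.6

def score(use_count: int, seconds_since_access: float, strength: float = 1.0) -> float:
    """score(t) = n_use^beta * exp(-lambda * dt) * s"""
    return (use_count ** BETA) * math.exp(-LAMBDA * seconds_since_access) * strength

DAY = 86400
fresh = score(1, 0)            # step 2: just saved -> 1.0
faded = score(1, 6 * DAY)      # step 3: six idle days = two half-lives -> 0.25
reinforced = score(5, 0)       # steps 4-5: referenced five times -> ~2.63
print(fresh, faded, reinforced)
```

Note how reinforcement outweighs decay: five uses more than double the score, while six days of silence quarter it.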

Why It's Different

Most memory systems are dumb:

  • ❌ "Delete after 7 days" (doesn't care if you used it 100 times)
  • ❌ "Keep last 100 items" (throws away important stuff just because it's old)

CortexGraph is smart:

  • ✅ Combines recency (when?), frequency (how often?), and importance (how critical?)
  • ✅ Memories fade naturally like human memory
  • ✅ Frequently used memories stick around longer
  • ✅ You can mark critical things to "never forget"

Technical Overview

This repository contains research, design, and a complete implementation of a short-term memory system that combines:

  • Novel temporal decay algorithm based on cognitive science
  • Reinforcement learning through usage patterns
  • Two-layer architecture (STM + LTM) for working and permanent memory
  • Smart prompting patterns for natural LLM integration
  • Git-friendly storage with human-readable JSONL
  • Knowledge graph with entities and relations

Module Organization

CortexGraph follows a modular architecture:

  • cortexgraph.core: Foundational algorithms (decay, similarity, clustering, consolidation, search validation)
  • cortexgraph.agents: Multi-agent consolidation pipeline and storage utilities
  • cortexgraph.storage: JSONL and SQLite storage backends with batch operations
  • cortexgraph.tools: MCP tool implementations

Why CortexGraph?

🔒 Privacy & Transparency

All data stored locally on your machine - no cloud services, no tracking, no data sharing.

  • Short-term memory:

    • JSONL (default): Human-readable, git-friendly files (~/.config/cortexgraph/jsonl/)
    • SQLite: Robust database storage for larger datasets (~/.config/cortexgraph/cortexgraph.db)
  • Long-term memory: Markdown files optimized for Obsidian

    • YAML frontmatter with metadata
    • Wikilinks for connections
    • Permanent storage you control
  • Export: Built-in utility to export memories to Markdown for portability.

You own your data. You can read it, edit it, delete it, or version control it - all without any special tools.
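To make the "human-readable, git-friendly" claim concrete, here is a hypothetical sketch of the JSONL short-term store: one JSON object per line, appendable and diff-friendly. The field names in this record are invented for the example and are not CortexGraph's actual schema.

```python
import json
import tempfile
import time
from pathlib import Path

# Invented example record -- field names are illustrative only.
record = {
    "id": "mem-001",
    "content": "User prefers TypeScript over JavaScript",
    "entities": ["typescript", "javascript"],
    "use_count": 1,
    "strength": 1.3,
    "last_accessed": int(time.time()),
}

store = Path(tempfile.mkdtemp()) / "memories.jsonl"
with store.open("a", encoding="utf-8") as f:
    f.write(json.dumps(record) + "\n")  # one JSON object per line

# Reading it back needs nothing but the standard library (or grep).
loaded = [json.loads(line) for line in store.read_text().splitlines()]
print(loaded[0]["content"])
```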

Core Algorithm

The temporal decay scoring function:

$$ \Large \text{score}(t) = (n_{\text{use}})^\beta \cdot e^{-\lambda \cdot \Delta t} \cdot s $$

Where:

  • $\large n_{\text{use}}$ - Use count (number of accesses)
  • $\large \beta$ (beta) - Sub-linear use count weighting (default: 0.6)
  • $\large \lambda = \frac{\ln(2)}{t_{1/2}}$ (lambda) - Decay constant; set via half-life (default: 3 days)
  • $\large \Delta t$ - Time since last access (seconds)
  • $\large s$ - Strength parameter $\in [0, 2]$ (importance multiplier)

Thresholds:

  • $\large \tau_{\text{forget}}$ (default 0.05) — if score < this, forget
  • $\large \tau_{\text{promote}}$ (default 0.65) — if score ≥ this, promote (or if $\large n_{\text{use}}\ge5$ in 14 days)

Decay Models:

  • Power‑Law (default): heavier tail; most human‑like retention
  • Exponential: lighter tail; forgets sooner
  • Two‑Component: fast early forgetting + heavier tail

See detailed parameter reference, model selection, and worked examples in docs/scoring_algorithm.md.
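The difference between the decay models is all in the tail. The sketch below contrasts the exponential shape with a textbook power-law form, $(1 + \Delta t/c)^{-a}$; the exact power-law parameterization CortexGraph uses is in docs/scoring_algorithm.md, so treat the constants here as assumptions for illustration.

```python
import math

HALF_LIFE_S = 3 * 86400
LAM = math.log(2) / HALF_LIFE_S

def exponential(dt: float) -> float:
    # Light tail: halves every half-life, forever.
    return math.exp(-LAM * dt)

def power_law(dt: float, a: float = 1.0, c: float = HALF_LIFE_S) -> float:
    # Heavier tail: decays fast early, then flattens (illustrative form).
    return (1 + dt / c) ** (-a)

# After 12 days (four half-lives) the power-law curve retains far more:
dt = 12 * 86400
print(exponential(dt), power_law(dt))
```

This heavier tail is why the power-law default is described as the most human-like: old-but-once-useful memories linger near the forget threshold instead of vanishing abruptly.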

Tuning Cheat Sheet

  • Balanced (default)
    • Half-life: 3 days (λ ≈ 2.67e-6)
    • β = 0.6, τ_forget = 0.05, τ_promote = 0.65, use_count≥5 in 14d
    • Strength: 1.0 (bump to 1.3–2.0 for critical)
  • High‑velocity context (ephemeral notes, rapid switching)
    • Half-life: 12–24 hours (λ ≈ 1.60e-5 to 8.02e-6)
    • β = 0.8–0.9, τ_forget = 0.10–0.15, τ_promote = 0.70–0.75
  • Long retention (research/archival)
    • Half-life: 7–14 days (λ ≈ 1.15e-6 to 5.73e-7)
    • β = 0.3–0.5, τ_forget = 0.02–0.05, τ_promote = 0.50–0.60
  • Preference/decision heavy assistants
    • Half-life: 3–7 days; β = 0.6–0.8
    • Strength defaults: 1.3–1.5 for preferences; 1.8–2.0 for decisions
  • Aggressive space control
    • Raise τ_forget to 0.08–0.12 and/or shorten half-life; schedule weekly GC
  • Environment template
    • CORTEXGRAPH_DECAY_LAMBDA=2.673e-6, CORTEXGRAPH_DECAY_BETA=0.6
    • CORTEXGRAPH_FORGET_THRESHOLD=0.05, CORTEXGRAPH_PROMOTE_THRESHOLD=0.65
    • CORTEXGRAPH_PROMOTE_USE_COUNT=5, CORTEXGRAPH_PROMOTE_TIME_WINDOW=14
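The λ values quoted in the cheat sheet all come from the $\lambda = \ln(2)/t_{1/2}$ relation in the Core Algorithm section. A one-line helper (the name is illustrative) reproduces them, which is handy when setting CORTEXGRAPH_DECAY_LAMBDA for a custom half-life:

```python
import math

def half_life_to_lambda(seconds: float) -> float:
    # lambda = ln(2) / t_half, in units of 1/second
    return math.log(2) / seconds

DAY = 86400
print(f"{half_life_to_lambda(3 * DAY):.3e}")   # 3-day default -> 2.674e-06
print(f"{half_life_to_lambda(7 * DAY):.3e}")   # 7 days -> 1.146e-06
```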

Decision thresholds:

  • Forget: $\text{score} < 0.05$ → delete memory
  • Promote: $\text{score} \geq 0.65$ OR $n_{\text{use}} \geq 5$ within 14 days → move to LTM
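The two rules above combine into a three-way decision per memory. A minimal sketch, using the documented default thresholds (function and constant names are illustrative):

```python
# Documented defaults: forget below 0.05, promote at 0.65 or on heavy use.
FORGET_THRESHOLD = 0.05
PROMOTE_THRESHOLD = 0.65
PROMOTE_USE_COUNT = 5
PROMOTE_WINDOW_DAYS = 14

def decide(score: float, use_count: int, days_since_created: float) -> str:
    if score < FORGET_THRESHOLD:
        return "forget"      # delete from STM
    if score >= PROMOTE_THRESHOLD or (
        use_count >= PROMOTE_USE_COUNT
        and days_since_created <= PROMOTE_WINDOW_DAYS
    ):
        return "promote"     # move to LTM (Markdown vault)
    return "keep"            # stay in STM, keep decaying

print(decide(0.03, 1, 2))    # forget
print(decide(0.70, 2, 30))   # promote (score rule)
print(decide(0.30, 5, 10))   # promote (use-count rule)
print(decide(0.30, 2, 10))   # keep
```

Note that the use-count rule can promote a memory whose score never crosses 0.65, which is how frequently touched but low-strength facts still reach permanent storage.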

Key Innovations

1. Temporal Decay with Reinforcement

Unlike traditional caching (TTL, LRU), CortexGraph scores memories continuously by combining recency (exponential decay), frequency (sub-linear use count), and importance (adjustable strength). See Core Algorithm for the mathematical formula. This creates memory dynamics that closely mimic human cognition.

2. Smart Prompting System + Natural Language Activation (v0.6.0+)

Patterns for making AI assistants use memory naturally, now enhanced with automatic entity extraction and importance scoring:

Auto-Enrichment (NEW in v0.6.0)

When you save memories, CortexGraph automatically:

  • Extracts entities (people, technologies, organizations) using spaCy NER
  • Calculates importance/strength based on content markers
  • Detects save/recall intent from natural language phrases

```python
# Before v0.6.0 - manual entity specification
save_memory(content="Use JWT for auth", entities=["JWT", "auth"])

# v0.6.0+ - automatic extraction
save_memory(content="Use JWT for auth")
# Entities auto-extracted: ["jwt", "auth"]
# Strength auto-calculated based on content
```

Auto-Save

```
User: "Remember: I prefer TypeScript over JavaScript"
→ Detected save phrase: "Remember"
→ Automatically saved with:
   - Entities
```