# StixDB

A memory layer for AI agents that organizes itself.
## Install

```bash
pip install "stixdb-engine[local-dev]"
```
## Get Started

1. Configure (run once — saves to `~/.stixdb/config.json`):

   ```bash
   stixdb init
   ```

2. Start the server:

   ```bash
   stixdb daemon start
   ```

3. Ingest something:

   ```bash
   stixdb ingest ./docs/ -c my_project
   ```

4. Ask a question:

   ```bash
   stixdb ask "What did I learn about the auth system?" -c my_project
   ```

That's it. StixDB is now running in the background, organizing your memories automatically.
## Command Reference

### Setup
| Command | Description |
|---|---|
| stixdb init | Configure StixDB globally (~/.stixdb/config.json) |
| stixdb init --local | Configure per-project (.stixdb/config.json in current directory) |
| stixdb info | Show the active configuration |
| stixdb status | Ping the running server |
```bash
stixdb init    # walk through the setup wizard
stixdb info    # show LLM, storage, port, and key settings
stixdb status  # check if the server is up
```
### Server
Run StixDB as a foreground process or a background daemon.
#### Foreground

```bash
stixdb serve              # start on the configured port (default 4020)
stixdb serve --port 4321  # custom port
```
#### Daemon (background)
| Command | Description |
|---|---|
| stixdb daemon start | Start the server in the background |
| stixdb daemon stop | Stop the background server |
| stixdb daemon restart | Restart the background server |
| stixdb daemon status | Check if the daemon is running |
| stixdb daemon logs | View server logs |
```bash
stixdb daemon start
stixdb daemon status
stixdb daemon logs
stixdb daemon restart
stixdb daemon stop
```
### Ingest
Load files or entire directories into a collection.
```bash
stixdb ingest ./docs/                # ingest a folder
stixdb ingest ./notes.md             # ingest a single file
stixdb ingest ./src/ -c proj_myapp   # specify a collection
stixdb ingest ./data/ -c proj_myapp --tags source-code,docs
```
Options:
| Option | Default | Description |
|---|---|---|
| -c, --collection | from config | Target collection name |
| --tags | — | Comma-separated tags to attach to ingested nodes |
| --chunk-size | 600 | Characters per chunk |
| --chunk-overlap | 150 | Overlap between consecutive chunks |
| --importance | 0.7 | Base importance score (0.0–1.0) |
StixDB respects `.gitignore` and skips `node_modules`, `.git`, and binary files automatically.
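The `--chunk-size` / `--chunk-overlap` pair describes a sliding character window. A minimal sketch of that splitting behaviour using the defaults above (the exact splitter StixDB uses may differ; this is illustrative):

```python
def chunk_text(text: str, chunk_size: int = 600, chunk_overlap: int = 150) -> list[str]:
    """Split text into fixed-size character windows, each overlapping
    the previous one by `chunk_overlap` characters."""
    if chunk_size <= chunk_overlap:
        raise ValueError("chunk_size must be larger than chunk_overlap")
    step = chunk_size - chunk_overlap
    chunks = []
    for start in range(0, len(text), step):
        chunks.append(text[start:start + chunk_size])
        if start + chunk_size >= len(text):
            break
    return chunks
```

Smaller chunks retrieve more precisely; a larger overlap preserves context across chunk boundaries.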
### Store
Write a single memory node directly.
```bash
stixdb store "Alice is the lead engineer on the payments team." -c my_project
stixdb store "Decided to use KuzuDB for persistent storage." -c my_project --tags decisions --importance 0.9
```
Options:
| Option | Default | Description |
|---|---|---|
| -c, --collection | from config | Target collection |
| --tags | — | Comma-separated tags |
| --importance | 0.7 | Importance score (0.0–1.0) |
| --node-type | fact | Node type (fact, summary, etc.) |
### Search
Semantic search — fast, no LLM, returns ranked matches.
```bash
stixdb search "auth middleware"                  # search default collection
stixdb search "database decisions" -c proj_myapp # search a named collection
stixdb search "in progress" -c proj_myapp --top-k 10
```
Options:
| Option | Default | Description |
|---|---|---|
| -c, --collection | from config | Collection to search |
| --top-k | 10 | Number of results to return |
| --depth | 1 | Graph expansion depth (higher = more context) |
| --threshold | 0.25 | Minimum similarity score (0.0–1.0) |
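Conceptually, `--threshold` filters before `--top-k` truncates: candidates under the similarity floor are discarded, and the best of the remainder are returned. A sketch of that post-scoring step (scores here are supplied directly; in StixDB they come from embeddings):

```python
def rank_matches(scored: list[tuple[float, str]], top_k: int = 10,
                 threshold: float = 0.25) -> list[tuple[float, str]]:
    """Keep (similarity, node) pairs at or above the threshold,
    sorted best-first, truncated to top_k."""
    kept = [pair for pair in scored if pair[0] >= threshold]
    kept.sort(key=lambda pair: pair[0], reverse=True)
    return kept[:top_k]
```

Raising `--threshold` trades recall for precision; raising `--top-k` does the reverse.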
### Ask
Ask a natural-language question. The agent retrieves relevant memory, reasons over it, and returns a cited answer.
```bash
stixdb ask "What was I working on last session?"
stixdb ask "What decisions have been made about storage?" -c proj_myapp
stixdb ask "Summarize all known bugs" -c proj_myapp --top-k 20 --depth 3
```
Options:
| Option | Default | Description |
|---|---|---|
| -c, --collection | from config | Collection to query |
| --top-k | 15 | Nodes to retrieve before reasoning |
| --depth | 2 | Graph traversal depth |
| --threshold | 0.25 | Minimum similarity score |
| --thinking | 1 | Reasoning steps (1 = single-pass, 2+ = multi-hop) |
| --hops | 4 | Retrieval hops per thinking step |
When to use `ask` vs `search`:

- Use `search` when you want specific facts back quickly (no LLM cost).
- Use `ask` when you need the answer synthesised across multiple memories.
### Graph Viewer
Open an interactive visual graph of a collection in your browser.
```bash
stixdb graph                                # view default collection
stixdb graph proj_myapp                     # view a named collection
stixdb graph proj_myapp --viewer-port 8080  # custom local port
stixdb graph proj_myapp --no-browser        # print URL only
```
<p align="center">
<img src="assets/example_graph_evoled.png" alt="Example of an Evolved StixDB Graph" width="800" />
</p>
The viewer starts a local server (default port 4021) and opens your browser. Node colour = memory tier. Node size = importance. Click any node to read its full content.
Options:
| Option | Default | Description |
|---|---|---|
| --viewer-port, -p | 4021 | Local port for the graph viewer |
| --no-browser | false | Print URL without opening the browser |
### Collections
Manage your collections.
```bash
stixdb collections list                         # list all collections
stixdb collections stats my_project             # node/edge counts and tier breakdown
stixdb collections dedupe my_project            # remove duplicate chunks
stixdb collections dedupe my_project --dry-run  # preview duplicates without deleting
stixdb collections delete my_project            # delete a collection (irreversible)
stixdb collections delete my_project --yes      # skip confirmation prompt
```
## Configuration

StixDB is configured once with `stixdb init`. Settings are stored in `~/.stixdb/config.json` and read on every command.

To override settings for a single project, run `stixdb init --local` inside that directory. Local config takes priority over global.
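The local-over-global rule amounts to a layered merge of the two JSON files. A sketch of such a loader (the real loader's behaviour and key names are assumptions here):

```python
import json
from pathlib import Path

def load_config(project_dir: Path, home_dir: Path) -> dict:
    """Read global config, then local config; later (local) keys win."""
    merged: dict = {}
    for path in (home_dir / ".stixdb" / "config.json",      # global
                 project_dir / ".stixdb" / "config.json"):  # local
        if path.exists():
            merged.update(json.loads(path.read_text()))
    return merged
```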
### Environment Variables

You can also configure StixDB via environment variables or a `.env` file:
```bash
# LLM
STIXDB_LLM_PROVIDER=openai      # openai | anthropic | ollama | none
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...

# Storage
STIXDB_STORAGE_MODE=kuzu        # memory | kuzu | neo4j
STIXDB_KUZU_PATH=./my_db

# Server
STIXDB_API_PORT=4020
STIXDB_API_KEY=your-secret-key  # optional — enables auth on the REST API

# Background agent
STIXDB_AGENT_CYCLE_INTERVAL=30.0          # seconds between background cycles
STIXDB_AGENT_CONSOLIDATION_THRESHOLD=0.88
STIXDB_AGENT_DECAY_HALF_LIFE=48.0         # hours
STIXDB_AGENT_PRUNE_THRESHOLD=0.05
```
## Storage Backends

| Backend | Persistence | Install | Best For |
|---|---|---|---|
| In-Memory | Lost on restart | Included | Testing, prototypes |
| KuzuDB | On-disk | `pip install "stixdb-engine[local-dev]"` | Local dev, laptops |
| Neo4j + Qdrant | Persistent (external services) | Docker | High scale, multi-agent |
## How the Background Agent Works
Every 30 seconds (configurable), StixDB runs a background cycle on each collection:
- Merge — nodes above `0.88` cosine similarity are merged into a summary node. Originals are archived, not deleted.
- Deduplicate — exact content duplicates are collapsed (highest importance wins).
- Decay — archived nodes decay with a 48-hour half-life. Nodes below `0.05` importance are pruned.
The cycle processes a capped batch of 64 nodes — CPU cost stays flat regardless of collection size.
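Two of the steps above, deduplication and decay, reduce to a few lines under the stated defaults (decay follows the standard exponential half-life formula; similarity-based merging is omitted since it needs embeddings):

```python
PRUNE_THRESHOLD = 0.05  # nodes below this decayed importance are pruned

def decayed_importance(importance: float, hours_archived: float,
                       half_life_hours: float = 48.0) -> float:
    """Exponential decay: importance halves every `half_life_hours`."""
    return importance * 0.5 ** (hours_archived / half_life_hours)

def dedupe(nodes: list[dict]) -> list[dict]:
    """Collapse exact-content duplicates; the highest importance wins."""
    best: dict[str, dict] = {}
    for node in nodes:
        kept = best.get(node["content"])
        if kept is None or node["importance"] > kept["importance"]:
            best[node["content"]] = node
    return list(best.values())
```

For example, an archived node starting at importance 0.8 is down to 0.4 after two days and crosses the 0.05 prune threshold after four half-lives (eight days).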
## For Developers

### REST API

The StixDB server exposes a REST API on `http://localhost:4020` (default).
#### Health

```http
GET /health
```

```json
{ "status": "ok", "collections": ["proj_myapp", "proj_other"] }
```
#### Store a node

```http
POST /collections/{collection}/nodes
```

Request:

```json
{
  "content": "Alice leads the payments team.",
  "node_type": "fact",
  "importance": 0.8,
  "tags": ["team", "payments"]
}
```

Response:

```json
{ "node_id": "abc123", "collection": "proj_myapp", "status": "stored" }
```
#### Semantic search

```http
POST /search
```

```json
{
  "query": "who leads payments",
  "collection": "proj_myapp",
  "top_k": 10,
  "threshold": 0.25,
  "depth": 1
}
```
#### Ask (LLM reasoning)

```http
POST /collections/{collection}/ask
```

Request:

```json
{
  "question": "What decisions were made about auth?",
  "top_k": 20,
  "depth": 2,
  "thinking_steps": 1,
  "hops_per_step": 4,
  "max_tokens": 1024
}
```

Response:

```json
{
  "answer": "The team decided to use bcrypt...",
  "sources": [...],
  "confidence": 0.91
}
```
#### Ingest a file

```http
POST /collections/{collection}/ingest
```

```json
{
  "path": "/absolute/path/to/file.md",
  "tags": ["docs"],
  "chunk_size": 600,
  "chunk_overlap": 150
}
```