Rusty Intervals MCP
Fast and lightweight MCP server for intervals.icu
# Intent-Driven, Token-Efficient Intervals.icu MCP Server
A high-performance Rust MCP server for Intervals.icu designed around one idea: an LLM should interact with a small, semantically rich coaching interface, not a raw pile of endpoint wrappers.
- Public contract: 8 high-level intents + 1 resource
- Internal execution layer: a dynamic OpenAPI runtime that stays aligned with Intervals.icu
- Design goal: respect the agent's context window and return decision-ready coaching context
## Table of Contents
- Why this project exists
- What makes it different
- Public MCP surface
- Architecture at a glance
- Why it is token-efficient
- Quick start
- VS Code / Copilot setup
- Claude Desktop setup
- Example asks
- Deterministic coaching analytics
- Observability & Metrics
- Runtime configuration
- Development
- Docker and remote deployment
- Documentation map
- License
- Disclaimer
## Why this project exists
The original foundation of this project was strong: build tool behavior dynamically from the live Intervals.icu OpenAPI spec so the MCP server does not drift as the upstream API evolves.
That solved the maintenance problem.
It did not solve the agent UX problem.
Exposing one tool per API endpoint creates the exact failure mode modern MCP design tries to avoid:
- too many tools loaded into context
- too much low-level API detail exposed to the model
- too many multi-step orchestration burdens pushed onto the LLM
- more chances for bad tool selection, invalid arguments, and wasted tokens
This project now takes a different approach:
- keep the dynamic OpenAPI layer internally, where it belongs
- expose a capability-level intent surface to the LLM
- return structured, guidance-driven outputs instead of raw payloads
- compute important coaching metrics on the server, not in the model's head
In other words: dynamic under the hood, curated at the boundary.
## What makes it different

### 1. Intent-driven public interface

The LLM sees 8 high-level intents such as `analyze_training` or `modify_training`, not dozens of endpoint-shaped tools.
### 2. Dynamic OpenAPI runtime retained internally
This is not a hand-maintained wrapper that goes stale. The server still loads the Intervals.icu OpenAPI spec dynamically and uses it as the execution layer behind the intent orchestration.
### 3. Token-efficiency by default
Responses are designed for LLMs:
- structured and compact
- pre-filtered
- pre-aggregated
- guidance-rich
### 4. Deterministic coaching analytics
Read-only coaching intents use a deterministic pipeline to compute metrics such as readiness, ACWR context, monotony, strain, fatigue index, stress tolerance, durability index, recovery interpretation, and stream-derived execution signals.
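Several of these metrics have standard sports-science definitions. As a rough illustration only (not the server's actual implementation, whose exact formulas may differ), Foster's monotony and strain plus the acute:chronic workload ratio can be sketched like this:

```python
# Illustrative sketch of textbook load metrics; the server's own pipeline
# may use different windows, smoothing, or formulas.
from statistics import mean, pstdev

def monotony(daily_loads):
    """Weekly monotony: mean daily load divided by its standard deviation."""
    sd = pstdev(daily_loads)
    return mean(daily_loads) / sd if sd > 0 else float("inf")

def strain(daily_loads):
    """Weekly strain: total weekly load multiplied by monotony."""
    return sum(daily_loads) * monotony(daily_loads)

def acwr(last_28_days):
    """Acute:chronic workload ratio: 7-day average over 28-day average."""
    chronic = mean(last_28_days)
    return mean(last_28_days[-7:]) / chronic if chronic > 0 else 0.0

week = [60, 80, 0, 100, 70, 0, 120]   # hypothetical daily loads (e.g. TSS)
month = [70] * 21 + week              # 28 days of history
print(round(monotony(week), 2), round(strain(week), 1), round(acwr(month), 2))
```

Computing these deterministically on the server means the model receives a number plus an interpretation instead of 28 days of raw activity data.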
### 5. Safer mutation flows
Mutating intents are designed for agents:
- business identifiers instead of opaque system-first flows
- `dry_run` previews for risky changes
- `idempotency_token` support for safe retries
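As a sketch of what a safeguarded mutating call could look like from the client side (the `modify_training` name is real; the other argument names here are illustrative guesses, not the server's schema):

```python
# Hypothetical MCP tools/call payload for a safeguarded mutation.
import json
import uuid

request = {
    "method": "tools/call",
    "params": {
        "name": "modify_training",
        "arguments": {
            "action": "move",                        # illustrative argument
            "workout_ref": "Saturday long run",      # business identifier (illustrative)
            "target_date": "2025-06-08",             # illustrative
            "dry_run": True,                         # preview before committing
            "idempotency_token": str(uuid.uuid4()),  # makes retries safe
        },
    },
}
print(json.dumps(request, indent=2))
```

The agent would first call with `dry_run: true`, inspect the preview, then repeat the call with `dry_run: false` and the same token so a timeout-and-retry cannot apply the change twice.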
### 6. Rust-first operational profile
- single binary
- fast startup
- strong type safety
- good fit for local MCP, containers, and remote HTTP deployments
## Public MCP surface
The public MCP contract is intentionally small and stable.
### Intents
| Intent | Purpose | Mutating | Example ask |
|---|---|---:|---|
| plan_training | Create training plans across any horizon | ✅ | “Build me a 12-week 50K plan” |
| analyze_training | Analyze a single workout or a training period | ❌ | “Analyze yesterday’s workout” |
| modify_training | Move, edit, create, or delete workouts and events | ✅ | “Move Saturday’s workout to Sunday” |
| compare_periods | Compare two blocks of training | ❌ | “Compare this month vs last month” |
| assess_recovery | Assess readiness, recovery, and red flags | ❌ | “Am I ready for intensity tomorrow?” |
| manage_profile | View or update thresholds, zones, and profile settings | ✅ | “Update my threshold values from a lab test” |
| manage_gear | List, add, or retire gear | ✅ | “How much mileage is on my shoes?” |
| analyze_race | Post-race analysis and follow-up guidance | ❌ | “How did my 50K go?” |
### Resource
| Resource | Purpose |
|---|---|
| `intervals-icu://athlete/profile` | Ongoing athlete context including profile and fitness-related information |
### Public contract rules
- Names are outcome-oriented, not endpoint-oriented.
- Arguments are flattened so agents do not have to invent nested structures.
- Successful intent results use structured MCP output via `structuredContent`.
- Intent tool calls avoid duplicating the same payload into text content, reducing token waste.
- Error and partial states are guidance-driven, so the model is told what to do next.
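To make the contract concrete, here is a hypothetical sketch of a success envelope; the field names inside `structuredContent` are illustrative, not the server's real schema. The point is one structured payload with no mirrored text copy:

```python
# Illustrative intent result shape (field names are assumptions).
import json

result = {
    "structuredContent": {
        "summary": "Recovery is adequate; intensity tomorrow is reasonable.",
        "metrics": {"readiness": 0.74, "acwr": 1.05},
        "next_actions": ["analyze_training", "compare_periods"],
    },
    # note: no duplicated text rendering of the same payload
}
print(json.dumps(result["structuredContent"], indent=2))
```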
## Architecture at a glance
```
LLM Host (VS Code / Claude / Cursor / other MCP client)
              |
              | calls one high-level intent
              v
+-----------------------------+
|        Intent Layer         |
|  8 public coaching intents  |
+-----------------------------+
              |
              v
+-----------------------------+
|        Intent Router        |
| validation + idempotency    |
| orchestration + rendering   |
+-----------------------------+
              |
              v
+-----------------------------+
|  Internal Execution Layer   |
|   dynamic OpenAPI runtime   |
|      Intervals client       |
+-----------------------------+
              |
              v
       Intervals.icu API
```
### Layering philosophy
This README describes the project as a capability-level MCP server:
- the LLM interacts with goals
- the server handles orchestration
- the OpenAPI runtime remains an internal component layer
That separation is the core design decision behind the current architecture.
## Why it is token-efficient
This server is designed to reduce both static tool metadata cost and dynamic response cost.
### Smaller tool surface
Instead of flooding the model with endpoint-shaped tools, the server exposes only the intent surface that matters most in real coaching workflows.
### Compact outputs
Responses are shaped for actionability:
- summaries before detail
- decision-ready metrics before raw JSON
- markdown tables and structured content instead of schema dumps
- selective enrichment only when it changes the decision
### Server-side computation

The server computes important metrics and interpretations itself, including:
- readiness context
- fatigue and load guidance (Fatigue Index, Stress Tolerance, Durability Index)
- stream-aware execution signals
- training period summaries
This keeps the model focused on reasoning with the result rather than reconstructing the result.
### Guidance-driven follow-up
Intent responses include `suggestions` and `next_actions` so the host model knows how to continue without trial-and-error tool calling.
## Quick start
### Prerequisites
- Rust 1.94+ with Cargo, or
- Docker
### Get your Intervals.icu credentials
- Open https://intervals.icu/settings
- Scroll to the Developer section
- Create an API key
- Copy your API key
- Note your athlete ID from your profile URL (format: `i123456`)
### Install and run the MCP server
The server supports both STDIO (for local MCP clients) and HTTP (for remote clients) transport modes via the `MCP_TRANSPORT` environment variable.
```shell
git clone https://github.com/like-a-freedom/rusty-intervals-mcp.git
cd rusty-intervals-mcp
cp .env.example .env
# edit .env and set:
#   INTERVALS_ICU_API_KEY=your_api_key_here
#   INTERVALS_ICU_ATHLETE_ID=i123456
cargo install --locked --path crates/intervals_icu_mcp
```
### STDIO mode (default)
For local MCP clients like VS Code Copilot or Claude Desktop:
```shell
export INTERVALS_ICU_API_KEY=your_api_key_here
export INTERVALS_ICU_ATHLETE_ID=i123456
export MCP_TRANSPORT=stdio  # optional: stdio is the default
intervals_icu_mcp
```
### HTTP mode
For remote MCP clients or when running as a service:
```shell
# Generate a secret for JWT authentication
export JWT_MASTER_KEY=$(openssl rand -hex 64)
export INTERVALS_ICU_API_KEY=your_api_key_here
export INTERVALS_ICU_ATHLETE_ID=i123456
export MCP_TRANSPORT=http
export MCP_HTTP_ADDRESS=127.0.0.1:3000  # optional: default is 127.0.0.1:3000
export MAX_HTTP_BODY_SIZE=4194304       # optional: 4 MiB request limit
export REQUEST_TIMEOUT_SECONDS=30       # optional: per-request timeout
export IDLE_TIMEOUT_SECONDS=60          # optional: idle connection timeout
intervals_icu_mcp
```
The MCP endpoint is available at `http://<address>/mcp`.
### Authenticating with `/auth`
Exchange your Intervals.icu API key for a JWT:
```shell
curl -s -X POST http://127.0.0.1:3000/auth \
  -H "Content-Type: application/json" \
  -d '{"api_key": "your_api_key_here", "athlete_id": "i123456"}'
```
Response:
{
"token": "<jwt>",
"exp
