# Semantic Developer

A cross-platform desktop UI (Avalonia/.NET 8) for driving the Codex CLI app server using its JSON protocol. It lets you:

- Select a workspace folder and browse files via a lazy file tree
- Start a Codex session and stream assistant output in real time
- Send user input that is wrapped as protocol Submissions (app server)
- Auto-approve exec/patch requests (automatic)
- Pick a model (built-in or from `config.toml` profiles) and load MCP servers from the Codex CLI `config.toml` (`[mcp_servers]` section); see Windows setup for a WSL recipe
- Keep multiple Codex sessions active at once using the tabbed header (each tab title shows its live status, e.g., `Session 2 – thinking…`)
- See live token usage and estimated context remaining in the header

Important: This app runs Codex through the `app-server` subcommand.
## Requirements

- .NET SDK 8.0+
- Codex CLI v0.77.0 or newer installed and on `PATH`
  - Verify with: `codex app-server --help`
- No external Git required — uses LibGit2Sharp for repo init/staging/commit
## Platform Setup & CLI Modes

Semantic Developer can drive the Codex CLI from Linux, macOS, or Windows.

### Linux

- Install the .NET 8 SDK and the Codex CLI in your Linux environment.
- Profiles & MCP servers live under `~/.codex/config.toml` (respects `$CODEX_HOME`); prompts live under `~/.codex/prompts/`.

### macOS

- Install .NET 8 (e.g., `brew install dotnet-sdk`) and the Codex CLI (`brew install codex` or the official installer).
- Profiles & MCP servers: `~/.codex/config.toml` (respects `$CODEX_HOME`); prompts live under `~/.codex/prompts/`.

### Windows

- RECOMMENDED: see the Windows setup recipe.
- Install the Windows .NET 8 SDK and the Codex CLI for Windows, ensuring `codex.exe` is on your Windows `PATH`.
- Profiles & MCP servers default to `%USERPROFILE%\.codex\config.toml` (respects `CODEX_HOME`); prompts live under `%USERPROFILE%\.codex\prompts\`.
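On every platform the config path resolves the same way; a minimal sketch of that lookup (the helper name is ours, and the real CLI may resolve paths somewhat differently):

```python
import os
from pathlib import Path

def codex_config_path() -> Path:
    """Resolve config.toml as described above: $CODEX_HOME wins, else ~/.codex."""
    home = os.environ.get("CODEX_HOME")
    base = Path(home) if home else Path.home() / ".codex"
    return base / "config.toml"
```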
## Build & Run

- Restore/build: `dotnet build SemanticDeveloper/SemanticDeveloper.sln`
- Run the app: `dotnet run --project SemanticDeveloper/SemanticDeveloper`
## Usage

- Open the app, click “Select Workspace…” and choose a folder.
  - If it isn’t a Git repo and the Git library is available, you’ll be prompted to initialize one.
  - You can also initialize later from the header via “Initialize Git…”.
- Click “Restart Session” to launch `codex app-server` in the workspace directory (a session also starts automatically after you select a workspace).
- Type into the input box and press Enter to send. Output appears in the right panel.
- “CLI Settings” lets you change:
  - Model & reasoning effort
    - Before a session starts, the picker loads from `SemanticDeveloper/SemanticDeveloper/models.json` so you can choose models offline. Keep this file updated as Codex releases new entries.
    - When the app connects to Codex, the dialog refreshes with the live catalog.
    - Any profiles defined in `config.toml` (e.g., `$CODEX_HOME/config.toml`, defaulting to `~/.codex/config.toml`) are appended to the list and marked with an asterisk (`*`). Selecting a profile locks the reasoning controls and lets the profile determine the model/effort.
    - Profiles are optional—if you don’t have one, simply pick a built-in model.
  - Verbose logging (show suppressed output)
  - Enable MCP support (mirrors the `[mcp_servers]` entries from your Codex CLI `config.toml` and passes them directly to Codex)
    - Config defaults to `~/.codex/config.toml` on Linux/macOS and `%USERPROFILE%\.codex\config.toml` on Windows; both honor `$CODEX_HOME`.
  - Use API Key for Codex CLI (pipes the key to `codex login --with-api-key` before sessions; does not rely on existing CLI auth)
  - Allow network access for tools (sets `sandbox_policy.network_access=true` on turns so MCP tools can reach the network)
- Without the API key enabled, the app proactively authenticates with `codex auth login` (falling back to `codex login`) before sessions so your ChatGPT token is used.
- Need a second workspace or want to keep another Codex stream alive? Hit the + button next to the session tabs to spin up a parallel session—tab titles update in real time so you can see whether each workspace is `disconnected`, `thinking…`, or `idle`.
- Right-click a tab to rename it, or use the per-session Close Tab button/context menu to shut it down when you are done.
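For reference, an `[mcp_servers]` entry in `config.toml` follows the Codex CLI's table-per-server format; a minimal sketch (the server name, command, and path here are placeholders, not defaults):

```toml
# ~/.codex/config.toml — illustrative MCP server entry
[mcp_servers.docs-search]
command = "npx"
args = ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/notes"]
```

With MCP support enabled, entries like this are passed straight through to Codex when a session starts.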
## Directory Guardrails with AGENTS.md

- Right-click any folder in the workspace tree and choose Create AGENTS.md to seed a directory-specific instruction file.
- The file is created inside the chosen folder (or opened if it already exists) and loaded into the editor so you can tailor the guidance.
- Codex CLI automatically honors the closest `AGENTS.md` when editing files: deeper files override parent folders, and rules apply to the entire subtree beneath the file.
- Use these files to capture coding conventions, test commands, or “do/don’t” rules that the agent must follow for that part of the repo.
- Learn more about the convention at agents.md.
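As a concrete illustration, a directory-level `AGENTS.md` might look like this (the folder and rules are invented for the example):

```markdown
# Guidance for src/Services

- Run `dotnet test` before proposing changes in this subtree.
- Keep view-models free of Avalonia types; UI code belongs in the Views folder.
- Don't edit generated files under obj/ or bin/.
```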
## Profiles (`config.toml`) example

If you define profiles in `config.toml`, Semantic Developer surfaces them in the model picker (marked with `*`) alongside the built-in catalog. They’re entirely optional—the app works out of the box with the bundled models.

Example `config.toml` profiles:

```toml
# Optional: define provider(s) once, then reference by model_provider.
[model_providers.ollama]
name = "Ollama"
base_url = "http://localhost:11434/v1"

[profiles.gpt-5.2-codex-high]
model = "gpt-5.2-codex"
model_provider = "openai"
approval_policy = "never"
model_reasoning_effort = "high"
model_reasoning_summary = "auto"

[profiles.gpt-5.2-codex-medium]
model = "gpt-5.2-codex"
model_provider = "openai"
approval_policy = "never"
model_reasoning_effort = "medium"
model_reasoning_summary = "auto"

[profiles.gpt-5.2-codex-low]
model = "gpt-5.2-codex"
model_provider = "openai"
approval_policy = "never"
model_reasoning_effort = "low"
model_reasoning_summary = "auto"

[profiles.ollama-qwen3]
model = "qwen3-coder:30b"
model_provider = "ollama"
approval_policy = "never"
model_reasoning_effort = "medium"
model_reasoning_summary = "auto"
```
- The left file tree and right log pane are resizable using the vertical splitter between them.
- The header shows:
  - Current status: `idle`, `thinking…`, `responding…`, `applying patch…`, `starting…`, or `error`.
  - A soft indeterminate progress bar while busy.
  - Token stats (when available): `tokens <blended-total> • <percent> left`. The percent remaining is an estimate based on the model’s context window and may differ slightly from the server’s internal view.
  - When inside a Git repository: current branch and a small Git menu for quick actions.
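The percent-remaining figure is simple arithmetic over the context window; a sketch with made-up numbers (the app's exact blending of input/cached/output tokens may differ):

```python
def percent_context_left(blended_total: int, context_window: int) -> int:
    """Estimate remaining context as a whole percent, clamped at zero."""
    remaining = max(context_window - blended_total, 0)
    return round(100 * remaining / context_window)

# Hypothetical: 52,000 blended tokens against a 200,000-token window.
print(percent_context_left(52_000, 200_000))  # → 74
```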
## Git Integration

The app integrates basic Git operations directly in the header. All actions use LibGit2Sharp (embedded libgit2); the system `git` command is not required.

- Branch indicator
  - Shows the current branch (e.g., `main`) after the workspace path when the selected folder is inside a Git repo.
- Git menu (Git ▾)
  - Commit…
    - Stages all changes (`*`) and creates a commit with the provided message.
    - Uses your Git config for name/email if available; otherwise falls back to a local signature like `<user>@local`.
    - If there are no changes, you’ll get a friendly “No changes to commit.” notice.
    - Automatically pushes the current branch to its tracked remote (defaults to `origin`).
    - Optional: tick Create Pull Request to open your browser to a GitHub compare page after a successful push.
  - New Branch…
    - Creates and checks out a new branch based on the default branch when available.
    - Performs a best-effort `fetch` from `origin` first (no merge/rebase into your working copy).
    - Bases the new branch on, in order of preference: `origin/main`, `origin/master`, local `main`, local `master`, then current `HEAD`.
    - Example log: `Created and checked out 'feature-x' (based on origin/main).`
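The base-branch preference amounts to a first-match lookup; a sketch of that order (the helper and the plain-string branch model are ours, not the app's LibGit2Sharp code):

```python
def pick_base_branch(available: set[str]) -> str:
    """Return the first existing base per the documented preference order."""
    for name in ("origin/main", "origin/master", "main", "master"):
        if name in available:
            return name
    return "HEAD"  # fall back to the current commit

print(pick_base_branch({"master", "origin/master"}))  # → origin/master
```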
  - Switch Branch…
    - Checks out an existing branch by name (no automatic fetch/merge).
  - Get Latest
    - Fetches from the tracked remote (defaults to `origin`) and fast-forwards the current branch when possible.
    - Requires the branch to track a remote counterpart; otherwise a helpful log message is shown.
    - Stops early if a merge or rebase would be required (fast-forward only).
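“Fast-forward only” means the local branch tip must be an ancestor of the remote tip. Modeling history as a linear list of commit ids (a deliberate simplification of real Git graphs), the check looks like:

```python
def can_fast_forward(local: list[str], remote: list[str]) -> bool:
    """True when local history is a prefix of remote's, so no merge is needed."""
    return remote[: len(local)] == local

print(can_fast_forward(["a1", "b2"], ["a1", "b2", "c3"]))  # → True
print(can_fast_forward(["a1", "x9"], ["a1", "b2", "c3"]))  # → False
```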
  - Rollback Changes…
    - Hard-resets the working directory to `HEAD` and deletes untracked files.
    - Prompts for confirmation since this discards local changes.
  - Refresh
    - Refreshes the branch label and the file tree’s Git status coloring.
### Example workflow

1. Switch to an existing base branch (e.g., `main` or `master`).
2. Choose Git ▾ → Get Latest to fast-forward your local branch.
3. Use Git ▾ → New Branch… with your preferred naming convention (e.g., `feature/login-form`).
4. After making changes, select Commit…, enter a message, let the app push the branch for you, and optionally enable Create Pull Request to jump straight to GitHub once the push completes.
- Initialize Git…
  - When the workspace is not a Git repo, an “Initialize Git…” button appears in the header.
  - Initializes a repository in the selected folder, stages files, and attempts an initial commit (best-effort).
  - This is the same capability offered right after selecting a non-repo folder.
### Notes

- Operations are local unless a remote call is required (the optional `fetch` during “New Branch…”, the fast-forward fetch performed by “Get Latest”, and the push that runs after each commit).
- Open your workspace at the root of the Git repository (the folder containing `.git/`) so the app can detect and enable Git features; selecting a subdirectory skips the Git UI.
