# PSUnplugged
<br/>
<div align="center">
  <img src="media/PSUnplugged.png" alt="PSUnplugged logo" width="45%">
</div>
<br/>

PSUnplugged is a PowerShell orchestration shell for OpenAI’s Codex App Server. It gives PowerShell users a terminal-native way to drive the Codex runtime while inheriting native support for conversation history, approvals, streamed agent events, AGENTS.md, MCP servers, skills, and plugins. Instead of rebuilding an agent stack from scratch, PSUnplugged makes PowerShell the control surface for reasoning, execution, and automation.
<!-- PSUnplugged brings agentic AI directly into your terminal — no IDE, no extensions, no GUI. It's a PowerShell-native client for the OpenAI Codex App Server, giving you multi-turn conversations, streaming responses, and full Markdown rendering right where you already work: the command line. This is PowerShell as it was always meant to be used — not as a side panel in someone else's editor, but as the runtime itself. PSUnplugged is the bridge between the shell you know and the agentic future that's already here. -->

This release is read-only. The agent can read files, answer questions, and reason about your code — but won't write or execute anything. Read/write mode with an approval flow is coming in AI Agent Forge.
## 🚀 Join the AI Agent Forge Community
PSUnplugged is the beginning. The next phase — read/write mode, approval flows, MCP tool integration, and a full agentic workflow engine — is launching through AI Agent Forge.
AI Agent Forge is a community for PowerShell developers stepping into the agentic AI era. Early members get first access to new capabilities as they ship, direct input into the roadmap, and a front-row seat to what's coming next.
## Why
The IDE with a side-panel chat window is a fossil. The future is agentic workflows running wherever your code runs — including the terminal you already have open.
PSUnplugged talks directly to the Codex App Server over JSON-RPC via stdio. It gives you a first-class agentic experience from pure PowerShell, on any machine, in any pipeline.
And because the Codex App Server is provider-agnostic, so is PSUnplugged. Point it at OpenAI, Azure, Ollama, Mistral — swap a line in config.toml and you're done.
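On the wire, each call is a JSON-RPC 2.0 request written to the app server's stdin and answered on stdout. The method and field names below are illustrative only, not the actual Codex App Server schema:

```json
{"jsonrpc": "2.0", "id": 1, "method": "thread/new", "params": {"cwd": "C:\\PSUnplugged"}}
```

The server replies on stdout with a message carrying the matching `id` (again, a hypothetical shape):

```json
{"jsonrpc": "2.0", "id": 1, "result": {"thread": {"id": "thr_123"}}}
```

Streamed agent events arrive as additional notification lines between request and final result, which is what lets the REPL render output as it is generated.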
## What's Inside

```
PSUnplugged.psm1             # module — session, threads, turns, JSON-RPC
PSUnplugged.psd1             # module manifest (PowerShell Gallery ready)
ShowMarkdown.psm1            # terminal Markdown renderer
Threads/
  README.md                  # project-aware thread/project UX and usage
  PSUnplugged.Threads.psm1   # higher-level project/thread catalog commands
Examples/
  Start-AgentChat.ps1        # interactive REPL — multi-turn chat, streaming, slash commands
  QuickStart.ps1             # working examples for every feature
```
| File | What it does |
|---|---|
| `PSUnplugged.psm1` | Full PowerShell module — session management, threads, turns, low-level JSON-RPC |
| `PSUnplugged.psd1` | Module manifest — version, author, tags, Gallery metadata |
| `Threads/PSUnplugged.Threads.psm1` | Higher-level thread and project catalog commands |
| `Threads/README.md` | Usage guide for project-aware thread management |
| `Examples/Start-AgentChat.ps1` | Interactive REPL — multi-turn chat, streaming, slash commands |
| `Examples/QuickStart.ps1` | Working examples for every feature |
| `ShowMarkdown.psm1` | Terminal Markdown renderer — headers, code blocks, tables with box-drawing chars |
## Quick Start

### Prerequisites

- **Node.js 18+** — nodejs.org (provides `npm`)
- **PowerShell 7+** — aka.ms/powershell (Windows ships with 5.1; this module requires 7)
- **Codex CLI**

  ```powershell
  npm i -g @openai/codex
  ```

- **Authenticate** — choose one:
  - ChatGPT account (free tier, supports `gpt-5.1-codex` only): `codex login`
  - OpenAI API key (required for `gpt-4.1`, `gpt-4o`, etc.) — pass it directly at runtime, no login needed: `.\Examples\Start-AgentChat.ps1 -ApiKey $env:OPENAI_API_KEY`
- **Clone this repo**

  ```powershell
  git clone https://github.com/dfinke/PSUnplugged
  cd PSUnplugged
  ```

> **Windows tip:** If `Start-CodexSession` can't find the binary, set `$env:CODEX_EXE` to the full path of `codex.exe`:
>
> ```powershell
> $env:CODEX_EXE = (Get-ChildItem (npm root -g) -Recurse -Filter codex.exe |
>     Where-Object { $_.Length -gt 1MB } |
>     Select-Object -First 1).FullName
> ```
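Before moving on, it is easy to sanity-check the prerequisites. These are the standard version flags for each tool, assuming the Codex CLI installed above is on your PATH:

```powershell
node --version            # expect v18 or later
$PSVersionTable.PSVersion # expect a Major value of 7 or higher
codex --version           # confirms the Codex CLI resolves from PATH
```

If the last command fails on Windows, see the tip above about setting `$env:CODEX_EXE`.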
### Interactive chat

```powershell
# ChatGPT account (default model)
.\Examples\Start-AgentChat.ps1

# OpenAI API key — use any model
.\Examples\Start-AgentChat.ps1 -Model gpt-4.1 -ApiKey $env:OPENAI_API_KEY
```
### One-liner from a script

```powershell
Import-Module .\PSUnplugged.psm1

$session = Start-CodexSession
$answer  = Invoke-CodexQuestion -Session $session -Text "What does this repo do?"
Write-Host $answer
Stop-CodexSession -Session $session
```
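In unattended scripts it is worth guaranteeing the session is torn down even when a call throws. A minimal sketch using the same cmdlets:

```powershell
Import-Module .\PSUnplugged.psm1

$session = Start-CodexSession
try {
    $answer = Invoke-CodexQuestion -Session $session -Text "Summarize the TODOs in this repo"
    Write-Host $answer
}
finally {
    # Runs even if the question fails, so the codex process never leaks
    Stop-CodexSession -Session $session
}
```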
### Multi-turn conversation

```powershell
$session = Start-CodexSession
$thread  = New-CodexThread -Session $session -Cwd (Get-Location).Path

$r1 = Invoke-CodexTurn -Session $session -ThreadId $thread.id -Text "List the files here"
$r2 = Invoke-CodexTurn -Session $session -ThreadId $thread.id -Text "Now explain what each one does"

Stop-CodexSession -Session $session
```
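The same thread can be driven from a loop, which makes batch question-and-answer scripts straightforward. A sketch built only from the cmdlets shown above:

```powershell
$session = Start-CodexSession
$thread  = New-CodexThread -Session $session -Cwd (Get-Location).Path

# Each turn sees the full history of the turns before it,
# so later questions can refer back to earlier answers.
$questions = @(
    "List the files here"
    "Which of them export public functions?"
    "Suggest one improvement for each exported function"
)

$answers = foreach ($q in $questions) {
    Invoke-CodexTurn -Session $session -ThreadId $thread.id -Text $q
}

Stop-CodexSession -Session $session
```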
### Slash commands inside the chat REPL

| Command | Effect |
|---|---|
| `/new` | start a fresh thread |
| `/model <n>` | switch model and start a new thread |
| `/verbose` | toggle raw JSON-RPC output |
| `/quit` | exit |
### Thread settings used by Start-AgentChat.ps1

The interactive example starts each thread with these Codex App Server settings:

- `approvalPolicy = "never"` — do not pause for approval prompts. Other supported values in this module are `on-request` and `unless-trusted`.
- `sandbox = "workspace-write"` — use the workspace-write sandbox mode for the thread. Other supported values in this module are `read-only` and `danger-full-access`.

These values are also exposed by `New-CodexThread` in `PSUnplugged.psm1` as `-ApprovalPolicy` and `-SandboxType`.
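For example, to create a fully locked-down thread from your own script, pass the settings through those parameters (values taken from the supported settings listed above):

```powershell
$session = Start-CodexSession

# Read-only sandbox and no approval prompts — safe for CI or demos
$thread = New-CodexThread -Session $session -Cwd (Get-Location).Path `
    -ApprovalPolicy 'never' -SandboxType 'read-only'

Invoke-CodexTurn -Session $session -ThreadId $thread.id -Text "Audit this repo's module manifest"
Stop-CodexSession -Session $session
```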
<!-- Provider-Agnostic section under review — wire_api compat being validated across providers

## Provider-Agnostic

The Codex App Server supports any OpenAI-compatible provider. Add a block to `~/.codex/config.toml` (global) or `.codex/config.toml` in your project root (project-scoped) and point `env_key` at the environment variable holding your API key — Codex picks it up automatically, nothing hardcoded.

> **Project-scoped config:** drop a `.codex/config.toml` in your repo root and Codex uses it automatically for that project (trusted projects only). Providers, MCP servers, model settings — all scoped to that repo. Nothing bleeds into your global config.

Example `config.toml`:

```toml
# xAI Grok
[model_providers.xai]
name = "xAI"
base_url = "https://api.x.ai/v1"
env_key = "XAI_API_KEY"
wire_api = "chat"

# Mistral
[model_providers.mistral]
name = "Mistral"
base_url = "https://api.mistral.ai/v1"
env_key = "MISTRAL_API_KEY"
wire_api = "chat"

# Ollama (local, no key needed)
[model_providers.ollama]
name = "Ollama"
base_url = "http://localhost:11434/v1"

# Azure OpenAI
[model_providers.azure]
name = "Azure OpenAI"
base_url = "https://YOUR_RESOURCE.openai.azure.com/openai/v1"
env_key = "AZURE_OPENAI_API_KEY"
wire_api = "responses"
```

Then in PowerShell:

```powershell
$env:XAI_API_KEY = "your-key-here"
```

Or for a quick one-off redirect without touching config.toml:

```powershell
$env:OPENAI_BASE_URL = "http://localhost:11434/v1"
```

> **Note:** Providers that don't speak the OpenAI wire format (like Anthropic) need a translation proxy such as [LiteLLM](https://github.com/BerriAI/litellm).
-->
## More Than a Chat Client

The Codex App Server isn't just a model endpoint — it's a full agentic runtime. Think of it as MCP on steroids:

- **MCP client built in** — wire up any MCP server, local or remote, and the model sees its tools natively
- **Threads and memory** — persistent multi-turn conversations with full history
- **Approval policies** — control whether the agent can execute commands and modify files, or stay read-only
- **AGENTS.md and Skills** — already have an AGENTS.md in your repo? It works automatically. Your project context, your instructions, zero extra config
  - `AGENTS.md` lives at your repo root (e.g. `c:\PSUnplugged\AGENTS.md`) — the Codex App Server picks it up automatically for any trusted project checkout
  - Skills live under `.codex/skills/` in your repo root (e.g. `.codex/skills/my-skill.md`) — each file describes a reusable capability the model can invoke by name
  - A global `AGENTS.md` can also live at `~/.codex/AGENTS.md` to apply instructions across all projects
  - Skills and AGENTS.md stack: global instructions, repo-level instructions, and skills are all merged into the agent's context at session start
- **Provider-agnostic** — swap models without changing client code

PSUnplugged is the PowerShell binding to that runtime. When OpenAI ships the cloud version of the app-server, the same code points at a URL instead of a local process.
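As a concrete illustration, a minimal repo-root `AGENTS.md` might look like the following. The contents are entirely up to you; this example is hypothetical:

```markdown
# AGENTS.md

## Project context
PSUnplugged is a PowerShell module that drives the OpenAI Codex App Server over JSON-RPC.

## Instructions for the agent
- Prefer PowerShell 7+ idioms; this module does not target Windows PowerShell 5.1.
- Public cmdlets live in PSUnplugged.psm1; keep the scripts in Examples/ runnable.
```

Because global and repo-level instructions stack, this file is merged with `~/.codex/AGENTS.md` (if present) at session start.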
## Coming Next

- **Read/write mode** — the agent can execute commands, modify files, and take action in your repo. Includes an approval flow so nothing runs without your sign-off. Dropping at launch of AI Agent Forge.
- **MCP support** — local and remote servers, drop-in config. The model sees them as callable tools. Web search, databases, custom APIs — no extra client code needed.
- **Much more**