CodexKit

CodexKit is a lightweight SDK for embedding OpenAI Codex-style agents in Apple apps, with explicit support for iOS and macOS.

The main branch documents the upcoming 2.0 development line. If you are integrating the latest stable release, use the v1.1.0 docs instead.

Who This Is For

Use CodexKit if you are building a SwiftUI app for iOS or macOS and want:

  • ChatGPT sign-in (device code or OAuth)
  • secure session persistence
  • resumable threaded conversations
  • structured local memory with optional prompt injection
  • streamed assistant output
  • typed one-shot text and structured completions
  • host-defined tools with approval gates
  • persona- and skill-aware agent behavior
  • hidden runtime context compaction with preserved user-visible history
  • opt-in developer logging across runtime, auth, backend, and bundled stores
  • share/import-friendly message construction

The SDK stays tool-agnostic. Your app defines the tool surface and runtime UX.

Core Concepts

  • AgentRuntime: the main entry point. Owns auth state, threads, tool execution, personas, skills, and optional memory.
  • AgentThread: a persistent conversation with its own status, title, persona stack, skill IDs, and optional memory context.
  • UserMessageRequest: a single-turn request. Can include text, images, imported content, persona override, skill override, and memory selection.
  • CodexResponsesBackend: the built-in ChatGPT/Codex-style backend used for text/image/tool turns.
  • ToolDefinition: a host-defined capability the model can call through your app.
  • AgentPersonaStack: layered behavior instructions pinned to a thread or applied for one turn.
  • AgentSkill: a behavior module that can carry instructions plus tool policy.
  • AgentStructuredOutput: a typed Decodable contract for schema-constrained replies.
  • AgentMemoryConfiguration: optional local memory storage, retrieval, ranking, and capture policy.

Choose Your Level

  • Simple chat: sign in, create a thread, and call streamMessage(...) or sendMessage(...).
  • Typed app flows: use sendMessage(..., expecting:) to get a Decodable value back.
  • Tool-driven agents: register host tools and optionally gate them with approvals.
  • Rich behavior: add thread personas, skills, and execution policies.
  • Memory-backed agents: opt into automatic memory capture, guided writing, or raw record management.
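As a sketch of the typed path, assuming a signed-in runtime and an existing thread, and assuming sendMessage(..., expecting:) takes the target type and decodes the reply into it (the struct and its field names here are illustrative, not part of the SDK):

```swift
import CodexKit

// Illustrative contract for a schema-constrained reply;
// the type and field names are hypothetical.
struct TriageResult: Decodable {
    let category: String
    let priority: Int
}

// Assumes `runtime` and `thread` already exist (see the quickstart below).
let result: TriageResult = try await runtime.sendMessage(
    UserMessageRequest(text: "Triage this bug report: the app crashes on launch."),
    in: thread.id,
    expecting: TriageResult.self
)
print(result.category, result.priority)
```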

Quickstart (5 Minutes)

This quickstart targets the current main branch API surface (2.0 development line).

  1. Add this package to your Xcode project.
  2. Build an AgentRuntime with auth, secure storage, backend, approvals, and state store.
  3. Sign in, create a thread, and send a message.
import CodexKit
import CodexKitUI

let approvalInbox = ApprovalInbox()
let deviceCodeCoordinator = DeviceCodePromptCoordinator()

let runtime = try AgentRuntime(configuration: .init(
    authProvider: try ChatGPTAuthProvider(
        method: .deviceCode,
        deviceCodePresenter: deviceCodeCoordinator
    ),
    secureStore: KeychainSessionSecureStore(
        service: "CodexKit.ChatGPTSession",
        account: "main"
    ),
    backend: CodexResponsesBackend(
        configuration: .init(
            model: "gpt-5.4",
            reasoningEffort: .medium,
            enableWebSearch: true
        )
    ),
    approvalPresenter: approvalInbox,
    stateStore: try GRDBRuntimeStateStore(
        url: FileManager.default.urls(
            for: .applicationSupportDirectory,
            in: .userDomainMask
        ).first!
        .appendingPathComponent("CodexKit/runtime-state.sqlite")
    )
))

let _ = try await runtime.signIn()
let thread = try await runtime.createThread(title: "First Chat")
let stream = try await runtime.streamMessage(
    UserMessageRequest(text: "Hello from Apple platforms."),
    in: thread.id
)
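To consume the stream, you would typically iterate it with for try await; the concrete event type is SDK-defined, so this sketch assumes a simple text-delta case that may not match the actual enum:

```swift
// Hypothetical event shape: assumes the stream yields text deltas
// as they arrive from the assistant.
for try await event in stream {
    if case let .textDelta(chunk) = event {
        print(chunk, terminator: "")
    }
}
```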

Feature Matrix

| Capability | Support |
| --- | --- |
| Supported platforms | iOS 17+, macOS 14+ |
| iOS auth: device code | Yes |
| iOS auth: browser OAuth (localhost callback) | Yes |
| Threaded runtime state + restore | Yes |
| Streamed assistant output | Yes |
| Host-defined tools + approval flow | Yes |
| Configurable thinking level | Yes |
| Web search toggle (enableWebSearch) | Yes |
| Built-in request retry/backoff | Yes (configurable) |
| Structured local memory layer | Yes |
| Text + image input | Yes |
| Typed structured output (Decodable) | Yes |
| Mixed streamed text + typed structured output | Yes |
| Share/import helper (AgentImportedContent) | Yes |
| App Intents / Shortcuts example | Yes |
| Assistant image attachment rendering | Yes |
| Video/audio input attachments | Not yet |
| Built-in image generation API surface | Not yet (tool-based approach supported) |

Package Products

  • CodexKit: core runtime, auth, backend, tools, approvals
  • CodexKitUI: optional SwiftUI-facing helpers

Supported package platforms:

  • iOS 17+
  • macOS 14+

Architecture

flowchart LR
    A["SwiftUI App"] --> B["AgentRuntime"]
    B --> C["ChatGPTAuthProvider"]
    B --> D["SessionSecureStore<br/>KeychainSessionSecureStore"]
    B --> E["RuntimeStateStore<br/>GRDBRuntimeStateStore"]
    B --> F["CodexResponsesBackend"]
    B --> G["ToolRegistry + Executors"]
    B --> H["ApprovalPresenter<br/>ApprovalInbox"]
    F --> I["OpenAI Responses API"]

Recommended Live Setup

The recommended production path for iOS and macOS is:

  • ChatGPTAuthProvider
  • KeychainSessionSecureStore
  • CodexResponsesBackend
  • GRDBRuntimeStateStore
  • ApprovalInbox and DeviceCodePromptCoordinator from CodexKitUI

Bundled runtime-state stores now include:

  • GRDBRuntimeStateStore: the recommended production store. Uses SQLite through GRDB, supports migrations, query pushdown, redaction, whole-thread deletion, paged history reads, and lightweight restore/inspection.
  • FileRuntimeStateStore: a simple JSON-backed fallback for small apps, tests, or export/import-style workflows.
  • InMemoryRuntimeStateStore: useful for previews and tests.
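For SwiftUI previews or unit tests, a minimal runtime can swap in the in-memory store. This sketch mirrors the quickstart's initializer shapes and assumes InMemoryRuntimeStateStore has a no-argument initializer:

```swift
import CodexKit
import CodexKitUI

// Sketch: an in-memory store keeps runtime state for the process
// lifetime only, which is what previews and tests usually want.
let previewRuntime = try AgentRuntime(configuration: .init(
    authProvider: try ChatGPTAuthProvider(
        method: .deviceCode,
        deviceCodePresenter: DeviceCodePromptCoordinator()
    ),
    secureStore: KeychainSessionSecureStore(
        service: "CodexKit.Preview",
        account: "preview"
    ),
    backend: CodexResponsesBackend(configuration: .init(model: "gpt-5.4")),
    approvalPresenter: ApprovalInbox(),
    stateStore: InMemoryRuntimeStateStore()
))
```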

The bundled memory store is:

  • SQLiteMemoryStore: uses SQLite through GRDB for persisted memory records. Ordinary record reads/writes use GRDB requests directly; the remaining raw SQL is limited to SQLite-specific PRAGMA and FTS MATCH / bm25() paths.

If you are migrating from the older file-backed store, GRDBRuntimeStateStore(url:) automatically imports a sibling *.json runtime state file on first open. For example, runtime-state.sqlite will import from runtime-state.json if it exists and the SQLite store is still empty.

ChatGPTAuthProvider supports:

  • .deviceCode for the most reliable sign-in path
  • .oauth for browser-based ChatGPT OAuth

For browser OAuth, CodexKit uses the Codex-compatible redirect http://localhost:1455/auth/callback internally and only runs the loopback listener during active auth.
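Switching to browser OAuth would then look roughly like the following; the assumption that .oauth needs no extra presenter argument is mine, not confirmed by the SDK docs:

```swift
// Sketch: same provider type as the quickstart, browser-based OAuth method.
// The loopback listener on http://localhost:1455/auth/callback only runs
// during active auth.
let oauthProvider = try ChatGPTAuthProvider(method: .oauth)
```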

Platform Boundary

CodexKit ships a ChatGPT/Codex-style account flow and backend. It does not provide general OpenAI API platform access.

That means:

  • built in: ChatGPT sign-in, Codex-style threaded turns, tools, personas, skills, structured output, and optional local memory
  • not built in: separate API-key-based OpenAI platform clients, Realtime voice sessions, or other non-Codex API access

If your app needs capabilities outside the built-in backend path, the intended approach is to expose them through your own host tools or custom backend integration.
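As an illustration of that host-tool approach, a tool might be defined and registered roughly like this. ToolDefinition is named by the README, but its initializer shape, the handler signature, and the registerTool call are assumptions for the sketch:

```swift
import CodexKit

// Hypothetical tool surface: the name, parameters, and handler
// shape here are illustrative.
let weatherTool = ToolDefinition(
    name: "get_local_weather",
    description: "Returns current weather for a city name.",
    handler: { city in
        // Your app performs the real work (network call, cache read, etc.)
        // and returns text the model can use in its reply.
        "Sunny, 21°C in \(city)"
    }
)

// Registration would make the tool callable by the model,
// subject to your approval flow.
try await runtime.registerTool(weatherTool)
```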

CodexResponsesBackend also includes built-in retry/backoff for transient failures (429, 5xx, and network-transient URL errors like networkConnectionLost). You can tune or disable it:

let backend = CodexResponsesBackend(
    configuration: .init(
        model: "gpt-5.4",
        requestRetryPolicy: .init(
            maxAttempts: 3,
            initialBackoff: 0.5,
            maxBackoff: 4,
            jitterFactor: 0.2
        )
        // or disable:
        // requestRetryPolicy: .disabled
    )
)

CodexResponsesBackendConfiguration also lets you control the model thinking level:

let backend = CodexResponsesBackend(
    configuration: .init(
        model: "gpt-5.4",
        reasoningEffort: .high
    )
)

Developer Logging

CodexKit includes opt-in developer logging for the SDK itself. Logging is disabled by default and can be enabled independently on the runtime, built-in backend, and bundled stores.

let logging = AgentLoggingConfiguration.console(
    minimumLevel: .debug
)

let backend = CodexResponsesBackend(
    configuration: .init(
        model: "gpt-5.4",
        logging: logging
    )
)

let stateStore = try GRDBRuntimeStateStore(
    url: stateURL,
    logging: logging
)

let runtime = try AgentRuntime(configuration: .init(
    authProvider: authProvider,
    secureStore: secureStore,
    backend: backend,
    approvalPresenter: approvalInbox,
    stateStore: stateStore,
    logging: logging
))

You can also filter by category:

let logging = AgentLoggingConfiguration.osLog(
    minimumLevel: .debug,
    categories: [.runtime, .persistence, .network, .tools],
    subsystem: "com.example.myapp"
)

Available logging categories include:

  • auth
  • runtime
  • persistence
  • network
  • retry
  • compaction
  • tools
  • approvals
  • structuredOutput
  • memory

Use AgentConsoleLogSink for stderr-style console logs, AgentOSLogSink for unified Apple logging, or provide your own AgentLogSink.

Custom sinks make it possible to bridge CodexKit logs into your own telemetry or logging pipeline:

struct RemoteTelemetrySink: AgentLogSink {
    func log(_ entry: AgentLogEntry) {
        // Forward each SDK log entry to your own pipeline.
        Telemetry.shared.enqueue(entry)
    }
}
No findings