
OpenAI


This repository contains a community-maintained Swift implementation of the OpenAI public API.

Documentation

This library implements its types and methods in close accordance with the REST API documentation, which can be found at platform.openai.com.

Installation

Swift Package Manager

To integrate OpenAI into your Xcode project using Swift Package Manager:

  1. In Xcode, go to File > Add Package Dependencies...
  2. Enter the repository URL: https://github.com/MacPaw/OpenAI.git
  3. Choose your desired dependency rule (e.g., "Up to Next Major Version").

Alternatively, you can add it directly to your Package.swift file:

dependencies: [
    .package(url: "https://github.com/MacPaw/OpenAI.git", branch: "main")
]
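If you declare the dependency in Package.swift, you also need to add the OpenAI product to each target that uses it. A minimal sketch (the target name "MyApp" is a placeholder):

```swift
// Package.swift (sketch)
dependencies: [
    .package(url: "https://github.com/MacPaw/OpenAI.git", branch: "main")
],
targets: [
    .target(
        name: "MyApp",
        dependencies: [
            // Link the OpenAI library product into your target.
            .product(name: "OpenAI", package: "OpenAI")
        ]
    )
]
```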

Usage

Initialization

To initialize an API instance, you need to obtain an API token from your OpenAI organization.

Remember that your API key is a secret! Do not share it with others or expose it in any client-side code (browsers, apps). Production requests must be routed through your own backend server where your API key can be securely loaded from an environment variable or key management service.
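For local development, one common pattern is to load the key from the process environment rather than hard-coding it. A sketch (the environment-variable name and the fatal-error fallback are assumptions, not part of this library):

```swift
import Foundation
import OpenAI

// Read the key from an environment variable set in your Xcode scheme or shell.
// "OPENAI_API_KEY" is a conventional name, not required by the library.
guard let apiToken = ProcessInfo.processInfo.environment["OPENAI_API_KEY"] else {
    fatalError("Set OPENAI_API_KEY before running")
}

let openAI = OpenAI(apiToken: apiToken)
```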

<img width="1081" alt="company" src="https://user-images.githubusercontent.com/1411778/213204726-0772373e-14db-4d5d-9a58-bc249bac4c57.png">

Once you have a token, you can initialize the OpenAI class, which is the entry point to the API.

⚠️ OpenAI strongly recommends developers of client-side applications proxy requests through a separate backend service to keep their API key safe. API keys can access and manipulate customer billing, usage, and organizational data, so it's a significant risk to expose them.

let openAI = OpenAI(apiToken: "YOUR_TOKEN_HERE")

Optionally, you can initialize OpenAI with a token, an organization identifier, and a timeoutInterval.

let configuration = OpenAI.Configuration(token: "YOUR_TOKEN_HERE", organizationIdentifier: "YOUR_ORGANIZATION_ID_HERE", timeoutInterval: 60.0)
let openAI = OpenAI(configuration: configuration)

See OpenAI.Configuration for more values that can be passed on init for customization, such as host, basePath, port, scheme, and customHeaders.
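For example, when routing requests through your own backend (as recommended above), host and customHeaders can point the client at your proxy. A sketch; the hostname, header, and exact initializer labels beyond those listed above are assumptions:

```swift
// Sketch: route requests through your own backend proxy.
// "api.example.com" and the header name/value are placeholders.
let configuration = OpenAI.Configuration(
    token: "PROXY_SESSION_TOKEN",   // a short-lived token your backend issues
    host: "api.example.com",
    scheme: "https",
    customHeaders: ["X-Client-Version": "1.0.0"]
)
let openAI = OpenAI(configuration: configuration)
```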

Once you possess the token and the instance is initialized, you are ready to make requests.

Using the SDK with providers other than OpenAI

This SDK is primarily focused on the OpenAI Platform, but it also works with other providers that support an OpenAI-compatible API.

Use the .relaxed parsing option on Configuration, or see more details on the topic here
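A sketch of pointing the client at an OpenAI-compatible provider; the hostname is a placeholder, and the parsingOptions parameter label is an assumption based on the .relaxed option mentioned above:

```swift
let configuration = OpenAI.Configuration(
    token: "PROVIDER_API_KEY",
    host: "api.provider.example",   // placeholder for your provider's host
    parsingOptions: .relaxed        // tolerate minor deviations from OpenAI's response schema
)
let client = OpenAI(configuration: configuration)
```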

Cancelling requests

For Swift Concurrency calls, you can simply cancel the calling task, and the corresponding underlying URLSessionDataTask will be cancelled automatically.

let task = Task {
    do {
        let chatResult = try await openAIClient.chats(query: .init(messages: [], model: .gpt4_1))
    } catch {
        // Handle cancellation or error
    }
}
            
task.cancel()
<details> <summary>Cancelling closure-based API calls</summary>

When you call any of the closure-based API methods, it returns a discardable CancellableRequest. Hold a reference to it so you can cancel the request later.

let cancellableRequest = object.chats(query: query, completion: { _ in })
cancellableRequest.cancelRequest()
</details> <details> <summary>Cancelling Combine subscriptions</summary>

In Combine, use the standard cancellation mechanism: discard the reference to the subscription, or call `cancel()` on it.

let subscription = openAIClient
    .images(query: query)
    .sink(receiveCompletion: { completion in }, receiveValue: { imagesResult in })
    
subscription.cancel()
</details>

Text and prompting

Responses

Use the responses property on OpenAIProtocol to call Responses API methods.

public protocol OpenAIProtocol {
    // ...
    var responses: ResponsesEndpointProtocol { get }
    // ...
}

Specify parameters by passing a CreateModelResponseQuery to a method. In response, you get a ResponseObject or a stream of ResponseStreamEvent events.

Example: Generate text from a simple prompt

let client: OpenAIProtocol = /* client initialization code */

let query = CreateModelResponseQuery(
    input: .textInput("Write a one-sentence bedtime story about a unicorn."),
    model: .gpt4_1
)

let response: ResponseObject = try await client.responses.createResponse(query: query)
// ...
<details> <summary>print(response)</summary>
ResponseObject(
  createdAt: 1752146109,
  error: nil,
  id: "resp_686fa0bd8f588198affbbf5a8089e2d208a5f6e2111e31f5",
  incompleteDetails: nil,
  instructions: nil,
  maxOutputTokens: nil,
  metadata: [:],
  model: "gpt-4.1-2025-04-14",
  object: "response",
  output: [
    OpenAI.OutputItem.outputMessage(
      OpenAI.Components.Schemas.OutputMessage(
        id: "msg_686fa0bee24881988a4d1588d7f65c0408a5f6e2111e31f5",
        _type: OpenAI.Components.Schemas.OutputMessage._TypePayload.message,
        role: OpenAI.Components.Schemas.OutputMessage.RolePayload.assistant,
        content: [
          OpenAI.Components.Schemas.OutputContent.OutputTextContent(
            OpenAI.Components.Schemas.OutputTextContent(
              _type: OpenAI.Components.Schemas.OutputTextContent._TypePayload.outputText,
              text: "Under a sky full of twinkling stars, a gentle unicorn named Luna danced through fields of stardust, spreading sweet dreams to every sleeping child.",
              annotations: [],
              logprobs: Optional([])
            )
          )
        ],
        status: OpenAI.Components.Schemas.OutputMessage.StatusPayload.completed
      )
    )
  ],
  parallelToolCalls: true,
  previousResponseId: nil,
  reasoning: Optional(
    OpenAI.Components.Schemas.Reasoning(
      effort: nil,
      summary: nil,
      generateSummary: nil
    )
  ),
  status: "completed",
  temperature: Optional(1.0),
  text: OpenAI.Components.Schemas.ResponseProperties.TextPayload(
    format: Optional(
      OpenAI.Components.Schemas.TextResponseFormatConfiguration.ResponseFormatText(
        OpenAI.Components.Schemas.ResponseFormatText(
          _type: OpenAI.Components.Schemas.ResponseFormatText._TypePayload.text
        )
      )
    )
  ),
  toolChoice: OpenAI.Components.Schemas.ResponseProperties.ToolChoicePayload.ToolChoiceOptions(
    OpenAI.Components.Schemas.ToolChoiceOptions.auto
  ),
  tools: [],
  topP: Optional(1.0),
  truncation: Optional("disabled"),
  usage: Optional(
    OpenAI.Components.Schemas.ResponseUsage(
      inputTokens: 18,
      inputTokensDetails: OpenAI.Components.Schemas.ResponseUsage.InputTokensDetailsPayload(
        cachedTokens: 0
      ),
      outputTokens: 32,
      outputTokensDetails: OpenAI.Components.Schemas.ResponseUsage.OutputTokensDetailsPayload(
        reasoningTokens: 0
      ),
      totalTokens: 50
    )
  )
)
</details>
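The Responses API can also stream events as they are generated. A hedged sketch; the createResponseStreaming method name, the stream parameter, and the event case names are assumptions to check against ResponsesEndpointProtocol:

```swift
let query = CreateModelResponseQuery(
    input: .textInput("Write a one-sentence bedtime story about a unicorn."),
    model: .gpt4_1,
    stream: true
)

// Iterate over ResponseStreamEvent values as the model produces output.
for try await event in client.responses.createResponseStreaming(query: query) {
    switch event {
    case .outputTextDelta(let delta):
        // Print each text fragment as it arrives.
        print(delta.delta, terminator: "")
    default:
        break
    }
}
```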