DelphiGenAI
The GenAI API wrapper for Delphi seamlessly integrates OpenAI's latest models (the gpt-5 series), delivering robust support for agent chats/responses, text generation, vision, audio analysis, JSON configuration, web search, asynchronous operations, video generation (Sora 2, Sora 2 Pro), and image generation with gpt-image-1.
Install / Use
/learn @MaxiDonkey/DelphiGenAIREADME
Delphi GenAI - Optimized OpenAI Integration
<br>
NEW:
- GetIt current version: 1.4.0
- Changelog v1.4.3 updated on January 11, 2026
- Provider Support and OpenAI API Compatibility
- Local model support via LM Studio (OpenAI-compatible server)
- Deep Research
- Videos using SORA
<br>
- Introduction
- Documentation Overview
- Tips for using the tutorial effectively
- Local model support via LM Studio
- GenAI functional coverage
- Provider Support and OpenAI API Compatibility
- Quick Start Guide
- Tips and tricks
- Deprecated
- Contributing
- License
<br>
Introduction
Built with Delphi 12 Community Edition (v12.1 Patch 1)
The wrapper itself is MIT-licensed.
You can compile and test it free of charge with Delphi CE; any recent commercial Delphi edition works as well.
DelphiGenAI is a full OpenAI wrapper for Delphi, covering the entire platform: text, vision, audio, image generation, video (Sora-2), embeddings, conversations, containers, and the latest v1/responses agentic workflows. It offers a unified interface with sync/async/await support across major Delphi platforms, making it easy to leverage modern multimodal and tool-based AI capabilities in Delphi applications.
<br>[!IMPORTANT]
This is an unofficial library. OpenAI does not provide any official library for
Delphi. This repository contains a Delphi implementation of the OpenAI public API.
Documentation Overview
Comprehensive Project Documentation Reference
- Changelog
- About this project
- Detailed documentation with synchronous and asynchronous examples is located in the guides folder.
Tips for using the tutorial effectively
Obtain an API Key
To initialize the API instance, you need to obtain an API key from OpenAI.
Once you have a key, you can initialize the IGenAI interface, which is the entry point to the API.
[!NOTE]
```pascal
//uses GenAI, GenAI.Types;

//Declare
//  Client: IGenAI;

// Cloud clients
Client := TGenAIFactory.CreateInstance(openAI_api_key);

// Local client (LM Studio – OpenAI compatible server)
Client := TGenAIFactory.CreateLMSInstance;  // default: http://127.0.0.1:1234/v1
// or
//Client := TGenAIFactory.CreateLMSInstance('http://192.168.1.10:1234');

Client := TGenAIFactory.CreateGeminiInstance(gemini_api_key);
```
To streamline the use of the API wrapper, the process for declaring units has been simplified. Regardless of which methods you use, you only need to reference the following two core units:
GenAI and GenAI.Types.
<br>[!TIP] To use the examples in this tutorial effectively, particularly when working with asynchronous methods, it is recommended to define the client interfaces with the broadest possible scope. For optimal implementation, these clients should be declared in the application's OnCreate method.
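As a sketch of that recommendation, the client can be held as a form-level field and created once in OnCreate (the form layout and the environment-variable key retrieval below are illustrative, not prescribed by the library):

```pascal
//uses GenAI, GenAI.Types, System.SysUtils;

type
  TForm1 = class(TForm)
    procedure FormCreate(Sender: TObject);
  private
    Client: IGenAI;  // form-wide scope: async callbacks may outlive the call site
  end;

procedure TForm1.FormCreate(Sender: TObject);
begin
  // Create the client once, when the form is created
  Client := TGenAIFactory.CreateInstance(GetEnvironmentVariable('OPENAI_API_KEY'));
end;
```

Because Client is an interface reference held by the form, it stays alive for the lifetime of the form, which is what asynchronous callbacks require.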
Code examples
The OpenAI API lets you plug advanced models into your applications and production workflows in just a few lines of code. Once billing is enabled on your account, your API keys become active and you can start making requests — including your first call to the chat endpoint within seconds.
<br>Synchronous code example
```pascal
//uses GenAI, GenAI.Types, GenAI.Tutorial.VCL;

var API_Key := 'OPENAI_API_KEY';
var Client := TGenAIFactory.CreateInstance(API_Key);

var Value := Client.Responses.Create(
  procedure (Params: TResponsesParams)
  begin
    Params
      .Model('gpt-4.1-mini')
      .Input('What is the difference between a mathematician and a physicist?')
      .Store(False);  // Response not stored
  end);
try
  for var Item in Value.Output do
    for var SubItem in Item.Content do
      Memo1.Text := SubItem.Text;
finally
  Value.Free;
end;
```
<br>
Asynchronous code example
```pascal
//uses GenAI, GenAI.Types, GenAI.Tutorial.VCL;

var Client: IGenAI;

procedure TForm1.Test;
begin
  var API_Key := 'OPENAI_API_KEY';
  Client := TGenAIFactory.CreateInstance(API_Key);
  Client.Responses.AsynCreate(
    procedure (Params: TResponsesParams)
    begin
      Params
        .Model('gpt-4.1-mini')
        .Input('What is the difference between a mathematician and a physicist?')
        .Store(False);  // Response not stored
    end,
    function: TAsynResponse
    begin
      Result.OnStart :=
        procedure (Sender: TObject)
        begin
          Memo1.Lines.Text := 'Please wait...';
        end;
      Result.OnSuccess :=
        procedure (Sender: TObject; Value: TResponse)
        begin
          for var Item in Value.Output do
            for var SubItem in Item.Content do
              Memo1.Text := SubItem.Text;
        end;
      Result.OnError :=
        procedure (Sender: TObject; Error: string)
        begin
          Memo1.Lines.Text := Error;
        end;
    end);
end;
```
<br>
Strategies for quickly using the code examples
To streamline the implementation of the code examples provided in this tutorial, two support units are included in the source code: GenAI.Tutorial.VCL and GenAI.Tutorial.FMX. Depending on the platform you choose for testing the examples, initialize either the TVCLTutorialHub or TFMXTutorialHub class within the application's OnCreate event, as illustrated below:
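A minimal sketch of that initialization follows. The constructor arguments shown here are hypothetical; consult GenAI.Tutorial.VCL (or GenAI.Tutorial.FMX) for the actual signature:

```pascal
//uses GenAI.Tutorial.VCL;

procedure TForm1.FormCreate(Sender: TObject);
begin
  // Hypothetical arguments: pass whichever controls the TutorialHub drives
  // (see GenAI.Tutorial.VCL for the exact constructor parameters).
  TutorialHub := TVCLTutorialHub.Create(Client, Memo1, Button1);
end;
```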
<br>[!IMPORTANT] In this repository, you will find in the
sample folder two ZIP archives, each containing a template to easily test all the code examples provided in this tutorial. Extract the VCL or FMX version depending on your target platform. Next, add the path to the DelphiGenAI library in your project's options, then copy and paste the code examples for immediate execution. These two archives are designed to fully leverage the TutorialHub middleware and enable rapid upskilling with DelphiGenAI.
Use file2knowledge
This project, built with DelphiGenAI, lets you consult the GenAI documentation and source code in order to streamline and accelerate your upskilling.
GenAI functional coverage
The table below summarizes all OpenAI endpoints supported by GenAI.
| Endpoint | Supported | Status / notes |
|--- |:---: |:---: |
| /assistants | <div align="center"><span style="color: green;">●</span></div> | |
| /audio/speech | <div align="center"><span style="color: green;">●</span></div> | |
| /audio/transcriptions | <div align="center"><span style="color: green;">●</span></div> | |
| /audio/translations | <div align="center"><span style="color: green;">●</span></div> | |
| /batches | <div align="center"><span style="color: green;">●</span></div> | |
| /chat/completions | <div align="center"><span style="color: green;">●</span></div> | |
| /chatkit | | |
| /completions | <div align="center"><span style="color: green;">●</span></div> | |
| /containers | <div align="center"><span style="color: green;">●</span></div> | |
| /conversations | <div align="center"><span style="color: green;">●</span></div> | |
| [/embeddings](guides/E
