DelphiDeepseek
The Deepseek API wrapper for Delphi leverages Deepseek’s advanced models to deliver powerful capabilities for seamless and dynamic conversational interactions, including a model optimized for reasoning, and now also supports running local models through an LM Studio server.
<br/>
NEW:
- GetIt current version: 1.0.4
- Changelog updated on November 26, 2025
- Local model support via LM Studio (OpenAI-compatible server)
<br/> <br/>
Introduction
Built with Delphi 12 Community Edition (v12.1 Patch 1)
The wrapper itself is MIT-licensed.
You can compile and test it free of charge with Delphi CE; any recent commercial Delphi edition works as well.
Deepseek for Delphi is a powerful Delphi library that brings the latest Deepseek APIs to your desktop, mobile, and server apps.
Core capabilities
- Unified access to text endpoints
- Supports state-of-the-art models, including deepseek-chat and the reasoning-centric deepseek-reasoner series
Developer tooling
- Ready-made `Sync`, `Async`, and `Await` code snippets (TutorialHUB compatible)
- Mock-friendly design: the HTTP layer is injected via dependency injection, so you can swap in stubs or fakes for testing
Integrate Deepseek into Delphi—no boilerplate, just results.
<br/><br/>[!IMPORTANT]
This is an unofficial library. Deepseek does not provide any official library for
Delphi. This repository contains a Delphi implementation of the Deepseek public API.
Wrapper Tools Info
This section briefly introduces the tools used throughout the tutorial to streamline the presentation and clarify the wrapper's functions.
<br/>Tools for simplifying this tutorial
Use the FMX or VCL app examples
You can use the code examples provided in this tutorial as-is; two support units are included in the source code: Deepseek.Tutorial.VCL and Deepseek.Tutorial.FMX. Depending on the platform selected for testing the provided examples, you will need to initialize either the TVCLTutorialHub or TFMXTutorialHub class within the application's OnCreate event, as illustrated below:
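A minimal sketch of this initialization for the VCL case. The constructor arguments shown here are assumptions (check the Deepseek.Tutorial.VCL unit for the exact signature), and Memo1/Button1 are illustrative controls on the form:

```pascal
uses
  Deepseek.Tutorial.VCL;

procedure TForm1.FormCreate(Sender: TObject);
begin
  // Hypothetical constructor arguments: the actual TVCLTutorialHub
  // signature is defined in the Deepseek.Tutorial.VCL support unit.
  TutorialHub := TVCLTutorialHub.Create(Memo1, Button1);
end;
```

For FMX projects, initialize TFMXTutorialHub from Deepseek.Tutorial.FMX in the same way.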
[!IMPORTANT] In this repository, you will find in the
sample folder two ZIP archives, each containing a template to easily test all the code examples provided in this tutorial. Extract the VCL or FMX version depending on your target platform for testing. Next, add the path to the Deepseek library in your project's options, then copy and paste the code examples for immediate execution. These two archives have been designed to fully leverage the TutorialHub middleware and enable rapid upskilling with Deepseek.
- VCL support with TutorialHUB: TTestDeepseek_VCL.zip
- FMX support with TutorialHUB: TestDeepseek_FMX.zip
Simplified Unit Declaration
To streamline the use of the API wrapper, the process for declaring units has been simplified. Regardless of the methods being utilized, you only need to reference the following two core units:
```pascal
uses
  Deepseek, Deepseek.Types;
```
If required, you may also include the Deepseek.Schema unit or any plugin units developed for specific function calls (e.g., Deepseek.Functions.Example). This simplification ensures a more intuitive and efficient integration process for developers.
Usage
Initialization
To initialize the API instance, you need to obtain an API key from Deepseek.
Once you have a key, you can initialize the IDeepseek interface, which is the entry point to the API.
[!NOTE]
```pascal
uses Deepseek;

// Cloud clients
var Deepseek := TDeepseekFactory.CreateInstance(API_KEY);
var DeepseekBeta := TDeepseekFactory.CreateBetaInstance(API_KEY);

// Local client (LM Studio – OpenAI compatible server)
var DeepseekLMS := TDeepseekFactory.CreateLMSInstance;  // default: http://127.0.0.1:1234/v1
// or:
// var DeepseekLMS := TDeepseekFactory.CreateLMSInstance('http://192.168.1.10:1234');
```
The DeepseekBeta client must be used to access APIs that are currently provided in beta version.
<br/>[!Warning] To effectively use the examples in this tutorial, particularly when working with asynchronous methods, it is recommended to define the Deepseek and DeepseekBeta interfaces with the broadest possible scope. For optimal implementation, these clients should be declared in the application's OnCreate method.
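The advice above can be sketched as follows, reusing the factory calls from the initialization snippet. The form and field names are illustrative, and API_KEY stands for your own key:

```pascal
type
  TForm1 = class(TForm)
    procedure FormCreate(Sender: TObject);
  private
    // Declared at form scope so every asynchronous callback can reach them
    Deepseek: IDeepseek;
    DeepseekBeta: IDeepseek;
  end;

procedure TForm1.FormCreate(Sender: TObject);
begin
  Deepseek := TDeepseekFactory.CreateInstance(API_KEY);
  DeepseekBeta := TDeepseekFactory.CreateBetaInstance(API_KEY);
end;
```

Because the fields are interface references, they are released automatically when the form is destroyed.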
Run models locally with LM Studio
Using non-DeepSeek models in LM Studio
- Download LM Studio: https://lmstudio.ai/
- This section assumes you are already familiar with LM Studio (loading models, starting the local OpenAI server, selecting the port, etc.).
The LM Studio backend exposes a fully OpenAI-compatible HTTP server.
Because the Delphi Deepseek wrapper forwards raw OpenAI-format requests to the server, you can load and run any model supported by LM Studio, even if it does not belong to the DeepSeek ecosystem.
Examples of models you can use transparently:
- openai/gpt-oss-20b (OpenAI)
- mistralai/mistral-7b-instruct-v0.3 (Mistral AI)
- NousResearch, Qwen, Falcon, Llama, Gemma, etc.
All these models work seamlessly with:
- Chat (sync, async, streaming, promises)
- FIM (if the model supports it)
- Parallel prompts
- Tools / function calling (if the model supports it)
You simply need to set:
```pascal
Params.Model('model-name-as-exposed-by-LM-Studio');
```
<br>[!NOTE] LM Studio may rename models when exposing them via the OpenAI server. Use the LM Studio UI → OpenAI Server panel to check the exact model identifier.
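If you prefer to check identifiers programmatically rather than through the UI, you can query the server's OpenAI-compatible `/v1/models` endpoint directly. This sketch uses the standard THTTPClient from the Delphi RTL and assumes the default LM Studio address:

```pascal
uses
  System.SysUtils, System.Net.HttpClient;

// Lists the model identifiers exposed by the local LM Studio server.
// Assumes the server runs at the default address; adjust the URL otherwise.
procedure ListLocalModels;
var
  Client: THTTPClient;
  Response: IHTTPResponse;
begin
  Client := THTTPClient.Create;
  try
    Response := Client.Get('http://127.0.0.1:1234/v1/models');
    // The response is a JSON object whose "data" array holds one entry
    // per model; each entry's "id" is the name to pass to Params.Model.
    Writeln(Response.ContentAsString);
  finally
    Client.Free;
  end;
end;
```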
Embedding models not supported
This wrapper intentionally does not include an Embeddings API, because:
- DeepSeek does not provide embedding endpoints in its official REST API.
- The LM Studio server exposes embeddings only for models designed for that purpose, and supporting an embeddings client API here would create a mismatch between the remote and local DeepSeek feature sets.
Therefore:
- Local embeddings via LM Studio = NOT supported.
- Cloud embeddings via Deepseek = NOT available.
This guarantees that the wrapper remains a strict, coherent implementation of DeepSeek’s documented API surface, while still allowing LM Studio for local LLM inference.
<br>Local chat example (non-streaming)
```pascal
TutorialHub.Clear;
DeepseekLMS.ClientHttp.ResponseTimeout := 120000;

//Asynchronous promise example
Start(TutorialHub);

var Promise := DeepseekLMS.Chat.AsyncAwaitCreate(
  procedure (Params: TChatParams)
  begin
    Params.Model('deepseek/deepseek-r1-0528-qwen3-8b');
    Params.Messages([
      FromUser('What is the capital of France, and then the capital of champagne?')
    ]);
    TutorialHub.JSONRequest := Params.ToFormat();
  end);

Promise
  .&Then<TChat>(
    function (Value: TChat): TChat
    begin
      Result := Value;
      Display(TutorialHub, Value);
    end)
  .&Catch(
    procedure (E: Exception)
    begin
      Display(TutorialHub, E.Message);
    end);
```
<br>
Local streaming example
```pascal
TutorialHub.Clear;

//Asynchronous promise example
var Promise := DeepseekLMS.Chat.AsyncAwaitCreateStream(
  procedure (Params: TChatParams)
  begin
    Params.Model('deepseek/deepseek-r1-0528-qwen3-8b');
    Params.Messages([
      FromUser('Does art belong to the artist or to his audience?')
    ]);
    Params.Stream;
    TutorialHub.JSONRequest := Params.ToFormat();
  end,
  function : TPromiseChatStream
  begin
    Result.Sender := TutorialHub;
    Result.OnProgress :=
      procedure (Sender: TObject; Chunk: TChat)
      begin
        DisplayStream(Sender, Chunk);
      end;
  end);

Promise
  .&Then<TPromiseBuffer>(
    function (Value: TPromiseBuffer): TPromiseBuffer
    begin
      Result := Value;
    end)
  .&Catch(
    procedure (E: Exception)
    begin
      Display(TutorialHub, E.Message);
    end);
```