ChatGptNet
A ChatGPT integration library for .NET, supporting both OpenAI and Azure OpenAI Service
Installation
The library is available on NuGet. Just search for ChatGptNet in the Package Manager GUI or run the following command in the .NET CLI:
dotnet add package ChatGptNet
Configuration
Register ChatGPT service at application startup:
builder.Services.AddChatGpt(options =>
{
    // OpenAI.
    //options.UseOpenAI(apiKey: "", organization: "");

    // Azure OpenAI Service.
    //options.UseAzure(resourceName: "", apiKey: "", authenticationType: AzureAuthenticationType.ApiKey);

    options.DefaultModel = "my-model";
    options.DefaultEmbeddingModel = "text-embedding-ada-002";
    options.MessageLimit = 16;  // Default: 10
    options.MessageExpiration = TimeSpan.FromMinutes(5);  // Default: 1 hour
    options.DefaultParameters = new ChatGptParameters
    {
        MaxTokens = 800,
        //MaxCompletionTokens = 800,  // o1 series models support this property instead of MaxTokens
        Temperature = 0.7
    };
});
ChatGptNet supports both OpenAI and Azure OpenAI Service, so it is necessary to set the correct configuration settings based on the chosen provider:
OpenAI (UseOpenAI)
- ApiKey: it is available in the User settings page of the OpenAI account (required).
- Organization: for users who belong to multiple organizations, you can also specify which organization is used. Usage from these API requests will count against the specified organization's subscription quota (optional).
Azure OpenAI Service (UseAzure)
- ResourceName: the name of your Azure OpenAI Resource (required).
- ApiKey: Azure OpenAI provides two methods for authentication. You can use either API Keys or Azure Active Directory (required).
- ApiVersion: the version of the API to use (optional). Allowed values:
- 2023-05-15
- 2023-06-01-preview
- 2023-10-01-preview
- 2024-02-01
- 2024-02-15-preview
- 2024-03-01-preview
- 2024-04-01-preview
- 2024-05-01-preview
- 2024-06-01
- 2024-07-01-preview
- 2024-08-01-preview
- 2024-09-01-preview
- 2024-10-01-preview
- 2024-10-21 (default)
- AuthenticationType: specifies whether the key is an actual API Key or an Azure Active Directory token (optional, default: "ApiKey").
DefaultModel and DefaultEmbeddingModel
ChatGPT can be used with different models for chat completion, both on OpenAI and Azure OpenAI Service. With the DefaultModel property, you can specify the default model that will be used, unless you pass an explicit value in the AskAsync or AskStreamAsync methods.
Although it is not strictly necessary for chat conversations, the library also supports the Embedding API, on both OpenAI and Azure OpenAI. As with chat completion, embeddings can be generated with different models. With the DefaultEmbeddingModel property, you can specify the default model that will be used, unless you pass an explicit value in the GetEmbeddingAsync method.
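Both defaults can be overridden on a per-call basis. A minimal sketch, assuming a service that receives the IChatGptClient via dependency injection (the model names below are placeholders for your own model or deployment names):

```csharp
public class AssistantService(IChatGptClient chatGptClient)
{
    public async Task<string?> AskAsync(string question)
    {
        // The model argument overrides DefaultModel for this call only.
        var response = await chatGptClient.AskAsync(question, model: "gpt-4o");
        return response.GetContent();
    }

    public async Task<float[]?> GetEmbeddingAsync(string text)
    {
        // The model argument overrides DefaultEmbeddingModel for this call only.
        var response = await chatGptClient.GetEmbeddingAsync(text, model: "text-embedding-ada-002");
        return response.GetEmbedding();
    }
}
```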
OpenAI
Currently available models are:
- gpt-3.5-turbo
- gpt-3.5-turbo-16k
- gpt-4
- gpt-4-32k
- gpt-4-turbo
- gpt-4o
- gpt-4o-mini
- o1-preview
- o1-mini
They have fixed names, available in the OpenAIChatGptModels.cs file.
Azure OpenAI Service
In Azure OpenAI Service, you're required to first deploy a model before you can make calls. When you deploy a model, you must assign it a name, which must match the name you use with ChatGptNet.
Note: some models are not available in all regions. You can refer to the Model Summary table and region availability page to check current availability.
Caching, MessageLimit and MessageExpiration
ChatGPT is designed to support conversational scenarios: the user can talk to ChatGPT without specifying the full context for every interaction. However, conversation history isn't managed by OpenAI or Azure OpenAI Service, so it's up to us to retain the current state. By default, ChatGptNet handles this requirement using a MemoryCache that stores the messages for each conversation. The behavior can be set using the following properties:
- MessageLimit: specifies how many messages for each conversation must be saved. When this limit is reached, the oldest messages are automatically removed.
- MessageExpiration: specifies the time interval for which messages are kept in the cache, regardless of their count.
If necessary, it is possible to provide a custom cache by implementing the IChatGptCache interface and then calling the WithCache extension method:
public class LocalMessageCache : IChatGptCache
{
    private readonly Dictionary<Guid, IEnumerable<ChatGptMessage>> localCache = new();

    public Task SetAsync(Guid conversationId, IEnumerable<ChatGptMessage> messages, TimeSpan expiration, CancellationToken cancellationToken = default)
    {
        localCache[conversationId] = messages.ToList();
        return Task.CompletedTask;
    }

    public Task<IEnumerable<ChatGptMessage>?> GetAsync(Guid conversationId, CancellationToken cancellationToken = default)
    {
        localCache.TryGetValue(conversationId, out var messages);
        return Task.FromResult(messages);
    }

    public Task RemoveAsync(Guid conversationId, CancellationToken cancellationToken = default)
    {
        localCache.Remove(conversationId);
        return Task.CompletedTask;
    }

    public Task<bool> ExistsAsync(Guid conversationId, CancellationToken cancellationToken = default)
    {
        var exists = localCache.ContainsKey(conversationId);
        return Task.FromResult(exists);
    }
}
// Registers the custom cache at application startup.
builder.Services.AddChatGpt(/* ... */).WithCache<LocalMessageCache>();
We can also set ChatGPT parameters for chat completion at startup. Check the official documentation for the list of available parameters and their meaning.
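These default parameters can also be overridden for a single request. A sketch, assuming the AskAsync overload that accepts a ChatGptParameters instance (the conversation id and question are illustrative):

```csharp
var response = await chatGptClient.AskAsync(conversationId, "Tell me about Rome",
    new ChatGptParameters
    {
        MaxTokens = 150,
        Temperature = 0.2
    });
```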
Configuration using an external source
The configuration can be automatically read from IConfiguration, using for example a ChatGPT section in the appsettings.json file:
"ChatGPT": {
    "Provider": "OpenAI",               // Optional. Allowed values: OpenAI (default) or Azure
    "ApiKey": "",                       // Required
    //"Organization": "",               // Optional, used only by OpenAI
    "ResourceName": "",                 // Required when using Azure OpenAI Service
    "ApiVersion": "2024-10-21",         // Optional, used only by Azure OpenAI Service (default: 2024-10-21)
    "AuthenticationType": "ApiKey",     // Optional, used only by Azure OpenAI Service. Allowed values: ApiKey (default) or ActiveDirectory
    "DefaultModel": "my-model",
    "DefaultEmbeddingModel": "text-embedding-ada-002",  // Optional, set it if you want to use embeddings
    "MessageLimit": 20,
    "MessageExpiration": "00:30:00",
    "ThrowExceptionOnError": true       // Optional, default: true
    //"User": "UserName",
    //"DefaultParameters": {
    //    "Temperature": 0.8,
    //    "TopP": 1,
    //    "MaxTokens": 500,
    //    "MaxCompletionTokens": null,  // o1 series models support this property instead of MaxTokens
    //    "PresencePenalty": 0,
    //    "FrequencyPenalty": 0,
    //    "ResponseFormat": { "Type": "text" },  // Allowed values for Type: text (default) or json_object
    //    "Seed": 42                    // Optional (any integer value)
    //},
    //"DefaultEmbeddingParameters": {
    //    "Dimensions": 1536
    //}
}
And then use the corresponding overload of the AddChatGpt method:
// Adds ChatGPT service using settings from IConfiguration.
builder.Services.AddChatGpt(builder.Configuration);
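After registration, the service can be consumed by resolving IChatGptClient. A minimal sketch of a Web API endpoint (the route and the Request record are illustrative):

```csharp
app.MapPost("/api/chat/ask", async (Request request, IChatGptClient chatGptClient) =>
{
    // Asks a question within the given conversation, using the cached history as context.
    var response = await chatGptClient.AskAsync(request.ConversationId, request.Message);
    return TypedResults.Ok(response);
});

public record class Request(Guid ConversationId, string Message);
```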
Configuring ChatGptNet dynamically
The AddChatGpt method also has an overload that accepts an IServiceProvider as argument. It can be used, for example, in a Web API that needs to support scenarios in which every user has a different API Key, retrieved by accessing a database via dependency injection:
builder.Services.AddChatGpt((services, options) =>
{
    var accountService = services.GetRequiredService<IAccountService>();

    // Dynamically gets the API Key from the service
    // (IAccountService and GetApiKey are application-defined).
    var apiKey = accountService.GetApiKey();

    options.UseOpenAI(apiKey);
});