OpenAI 1.5.0
C#/.NET SDK for accessing the OpenAI GPT-3 API
A simple C# .NET wrapper library to use with OpenAI's GPT-3 API. More context on my blog.
Quick Example
var api = new OpenAI_API.OpenAIAPI("YOUR_API_KEY");
var result = await api.Completions.GetCompletion("One Two Three One Two");
Console.WriteLine(result);
// should print something starting with "Three"
Readme
- Status
- Requirements
- Installation
- Authentication
- Completions API
- Embeddings API
- Files API
- Additional Documentation
- License
Status
Updated to work with the current API as of February 16, 2023. Fixed several minor bugs.
Now also should work with the Azure OpenAI Service, although this is untested. See Azure section for further details.
Thank you @GotMike, @ncface, @KeithHenry, @gmilano, @metjuperry, and @Alexei000 for your contributions!
Requirements
This library is based on .NET Standard 2.0, so it should work across .NET Framework >=4.7.2 and .NET Core >= 3.0. It should work across console apps, winforms, wpf, asp.net, etc (although I have not yet tested with asp.net). It should work across Windows, Linux, and Mac, although I have only tested on Windows so far.
Getting started
Install from NuGet
Install package OpenAI from NuGet. Here's how via the command line:
Install-Package OpenAI
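Alternatively, if you use the .NET CLI rather than the Package Manager console, the equivalent command should work as well:
dotnet add package OpenAI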
Authentication
There are 3 ways to provide your API keys, in order of precedence:
- Pass keys directly to the APIAuthentication(string key) constructor
- Set an environment variable named OPENAI_API_KEY (or OPENAI_KEY for backwards compatibility)
- Include a config file named .openai in the local directory or in your user directory, containing the line:
OPENAI_API_KEY=sk-aaaabbbbbccccddddd
You use the APIAuthentication when you initialize the API as shown:
// for example
OpenAIAPI api = new OpenAIAPI("YOUR_API_KEY"); // shorthand
// or
OpenAIAPI api = new OpenAIAPI(new APIAuthentication("YOUR_API_KEY")); // create object manually
// or
OpenAIAPI api = new OpenAIAPI(APIAuthentication.LoadFromEnv()); // use env vars
// or
OpenAIAPI api = new OpenAIAPI(APIAuthentication.LoadFromPath()); // use config file (can optionally specify where to look)
// or
OpenAIAPI api = new OpenAIAPI(); // uses default, env, or config file
You may optionally include an openAIOrganization (OPENAI_ORGANIZATION in env or config file) specifying which organization is used for an API request. Usage from these API requests will count against the specified organization's subscription quota. Organization IDs can be found on your Organization settings page.
// for example
OpenAIAPI api = new OpenAIAPI(new APIAuthentication("YOUR_API_KEY","org-yourOrgHere"));
Completions
The Completion API is accessed via OpenAIAPI.Completions:
async Task<CompletionResult> CreateCompletionAsync(CompletionRequest request);
// for example
var result = await api.Completions.CreateCompletionAsync(new CompletionRequest("One Two Three One Two", model: Model.CurieText, temperature: 0.1));
// or
var result = await api.Completions.CreateCompletionAsync("One Two Three One Two", temperature: 0.1);
// or other convenience overloads
You can create your CompletionRequest ahead of time or use one of the helper overloads for convenience. It returns a CompletionResult, which is mostly metadata, so use its .ToString() method to get the text if all you want is the completion.
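For example, a minimal sketch of building the request ahead of time and then reading back just the generated text (the parameter names follow the overloads shown above):
// build the request ahead of time, then extract only the completion text
var request = new CompletionRequest("One Two Three One Two", model: Model.CurieText, temperature: 0.1);
var result = await api.Completions.CreateCompletionAsync(request);
Console.WriteLine(result.ToString()); // prints just the completion text, not the metadata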
Streaming
Streaming allows you to get results as they are generated, which can help your application feel more responsive, especially on slow models like Davinci.
Using the new C# 8.0 async iterators:
IAsyncEnumerable<CompletionResult> StreamCompletionEnumerableAsync(CompletionRequest request);
// for example
await foreach (var token in api.Completions.StreamCompletionEnumerableAsync(new CompletionRequest("My name is Roger and I am a principal software engineer at Salesforce. This is my resume:", Model.DavinciText, 200, 0.5, presencePenalty: 0.1, frequencyPenalty: 0.1)))
{
Console.Write(token);
}
Or if using the classic .NET Framework or C# < 8.0:
async Task StreamCompletionAsync(CompletionRequest request, Action<CompletionResult> resultHandler);
// for example
await api.Completions.StreamCompletionAsync(
new CompletionRequest("My name is Roger and I am a principal software engineer at Salesforce. This is my resume:", Model.DavinciText, 200, 0.5, presencePenalty: 0.1, frequencyPenalty: 0.1),
res => ResumeTextbox.Text += res.ToString());
Embeddings
The Embedding API is accessed via OpenAIAPI.Embeddings:
async Task<EmbeddingResult> CreateEmbeddingAsync(EmbeddingRequest request);
// for example
var result = await api.Embeddings.CreateEmbeddingAsync(new EmbeddingRequest("A test text for embedding", model: Model.AdaTextEmbedding));
// or
var result = await api.Embeddings.CreateEmbeddingAsync("A test text for embedding");
The embedding result contains a lot of metadata; the actual vector of floats is in result.Data[].Embedding.
For simplicity, you can directly ask for the vector of floats and discard the extra metadata with api.Embeddings.GetEmbeddingsAsync("test text here").
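For example, a sketch assuming that helper returns the raw vector as an array of floats:
// the float[] return type is an assumption; the method name comes from the line above
float[] vector = await api.Embeddings.GetEmbeddingsAsync("test text here");
Console.WriteLine($"Embedding length: {vector.Length}");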
Files (for fine-tuning)
The Files API endpoint is accessed via OpenAIAPI.Files:
// uploading
async Task<File> UploadFileAsync(string filePath, string purpose = "fine-tune");
// for example
var response = await api.Files.UploadFileAsync("fine-tuning-data.jsonl");
Console.Write(response.Id); //the id of the uploaded file
// listing
async Task<List<File>> GetFilesAsync();
// for example
var response = await api.Files.GetFilesAsync();
foreach (var file in response)
{
Console.WriteLine(file.Name);
}
There are also methods to get file contents, delete a file, etc.
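Those additional operations are not documented here; a hypothetical sketch following the same pattern as the upload example (the method names are assumptions and may differ) might look like:
// hypothetical method names, shown only to illustrate the pattern
var contents = await api.Files.GetFileContentAsStringAsync(response.Id); // download the file's contents as a string
await api.Files.DeleteFileAsync(response.Id); // delete the uploaded file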
The fine-tuning endpoint itself has not yet been implemented, but will be added soon.
Azure
For using the Azure OpenAI Service, you need to specify the name of your Azure OpenAI resource as well as your model deployment id. Additionally, you may specify the API version, which defaults to 2022-12-01.
I do not have access to the Microsoft Azure OpenAI Service, so I am unable to test this functionality. If you have access and can test, please submit an issue describing your results. A PR with integration tests would also be greatly appreciated. Specifically, it is unclear to me whether specifying models works the same way with Azure.
Refer to the Azure OpenAI documentation for further information.
Configuration should look something like this for the Azure service:
OpenAIAPI api = OpenAIAPI.ForAzure("YourResourceName", "deploymentId", "api-key");
You may then use the api object like normal. You may also specify the APIAuthentication in any of the other ways listed in the Authentication section above. Currently this library only supports the api-key flow, not the AD flow.
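For example, a sketch combining ForAzure with environment-based authentication (this assumes ForAzure also accepts an APIAuthentication object in place of the raw api-key string):
// assumes an APIAuthentication can be passed where the api-key string goes
OpenAIAPI api = OpenAIAPI.ForAzure("YourResourceName", "deploymentId", APIAuthentication.LoadFromEnv());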
Documentation
Every single class, method, and property has extensive XML documentation, so it should show up automatically in IntelliSense. That combined with the official OpenAI documentation should be enough to get started. Feel free to open an issue here if you have any questions. Better documentation may come later.
License
This library is licensed CC-0, in the public domain. You can use it for whatever you want, publicly or privately, without worrying about permission or licensing or whatever. It's just a wrapper around the OpenAI API, so you still need to get access to OpenAI from them directly. I am not affiliated with OpenAI and this library is not endorsed by them, I just have beta access and wanted to make a C# library to access it more easily. Hopefully others find this useful as well. Feel free to open a PR if there's anything you want to contribute.
Dependencies
.NET Standard 2.0:
- Microsoft.Bcl.AsyncInterfaces (>= 1.1.1)
- Newtonsoft.Json (>= 13.0.2)