System.ClientModel 1.1.0-beta.1
System.ClientModel library for .NET
System.ClientModel contains building blocks for communicating with cloud services. It provides shared primitives, abstractions, and helpers for .NET service client libraries.
System.ClientModel allows client libraries built from its components to expose common functionality in a consistent fashion, so that once you learn how to use these APIs in one client library, you'll know how to use them in other client libraries as well.
Getting started
Typically, you will not need to install System.ClientModel directly; it will be installed for you when you install one of the client libraries that uses it.
Install the package
Install the client library for .NET with NuGet.
dotnet add package System.ClientModel
Prerequisites
None needed for System.ClientModel.
Authenticate the client
The System.ClientModel package provides an ApiKeyCredential type for authentication.
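For example, a client library built on System.ClientModel typically accepts this credential through its constructor. The following is a minimal sketch; SampleServiceClient and its endpoint are hypothetical placeholders, not types from System.ClientModel:
// 'key' is the service API key, typically read from configuration or an environment variable.
ApiKeyCredential credential = new ApiKeyCredential(key);
// SampleServiceClient is a hypothetical client built on System.ClientModel.
SampleServiceClient client = new SampleServiceClient(new Uri("https://www.example.com/"), credential);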
Key concepts
The main shared concepts of System.ClientModel include:
- Configuring service clients (ClientPipelineOptions).
- Accessing HTTP response details (ClientResult, ClientResult<T>).
- Exceptions for reporting errors from service requests in a consistent fashion (ClientResultException).
- Customizing requests (RequestOptions).
- Providing APIs to read and write models in different formats (ModelReaderWriter).
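The sketch below shows how several of these types typically appear together when calling a service client. It reuses the hypothetical SampleServiceClient from the authentication example; the Widget model and GetWidget methods are likewise illustrative, while the System.ClientModel types are real:
// 'client' is the hypothetical SampleServiceClient created in the authentication example.
// A convenience method returns a typed result along with the HTTP response details.
ClientResult<Widget> result = client.GetWidget("widget-id");
Widget widget = result.Value;                        // the strongly-typed model
PipelineResponse response = result.GetRawResponse(); // the underlying HTTP response
Console.WriteLine(response.Status);

// RequestOptions customizes a single service call, for example adding a header
// or suppressing the exception normally thrown for error responses.
RequestOptions options = new RequestOptions { ErrorOptions = ClientErrorBehaviors.NoThrow };
options.SetHeader("x-custom-header", "value");
ClientResult protocolResult = client.GetWidget("widget-id", options);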
Examples
Send a message using the ClientPipeline
A very basic client implementation might use the following approach:
// 'key' and 'requestOptions' are assumed to be provided by the caller.
// Create a pipeline that authenticates each request with an API key.
ApiKeyCredential credential = new ApiKeyCredential(key);
ApiKeyAuthenticationPolicy authenticationPolicy = ApiKeyAuthenticationPolicy.CreateBearerAuthorizationPolicy(credential);
ClientPipelineOptions pipelineOptions = new ClientPipelineOptions();
ClientPipeline pipeline = ClientPipeline.Create(pipelineOptions, authenticationPolicy);

// Create a message, apply any caller-provided options, and classify only 200 as success.
PipelineMessage message = pipeline.CreateMessage();
message.Apply(requestOptions);
message.ResponseClassifier = PipelineMessageClassifier.Create(stackalloc ushort[] { 200 });

// Populate the HTTP request.
PipelineRequest request = message.Request;
request.Method = "GET";
request.Uri = new Uri("https://www.example.com/");
request.Headers.Add("Accept", "application/json");

// Send the message and inspect the response status code.
pipeline.Send(message);
Console.WriteLine(message.Response.Status);
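After the call returns, the same message also exposes the response body; a short follow-on sketch continuing the example above:
// Continue with the message sent above. BufferResponse is true by default,
// so the body is available as BinaryData via Content.
PipelineResponse response = message.Response!;
if (!response.IsError)
{
    Console.WriteLine(response.Content.ToString());
}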
Read and write persistable models
As a library author, you can implement IPersistableModel<T> or IJsonModel<T> to give library users the ability to read and write your models.
Example: writing an instance of a model.
InputModel model = new InputModel();
BinaryData data = ModelReaderWriter.Write(model);
Example: reading a model from JSON.
string json = @"{
""x"": 1,
""y"": 2,
""z"": 3
}";
OutputModel? model = ModelReaderWriter.Read<OutputModel>(BinaryData.FromString(json));
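For library authors, the following is a minimal sketch of what an IJsonModel<T> implementation might look like. The Point type and its property names are illustrative, and real implementations typically add format validation and error handling:
// Requires: using System; using System.Text.Json; using System.ClientModel.Primitives;
public class Point : IJsonModel<Point>
{
    public int X { get; set; }
    public int Y { get; set; }

    // Serialize this instance as JSON.
    public void Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
    {
        writer.WriteStartObject();
        writer.WriteNumber("x", X);
        writer.WriteNumber("y", Y);
        writer.WriteEndObject();
    }

    // Deserialize an instance from JSON.
    public Point Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
    {
        using JsonDocument document = JsonDocument.ParseValue(ref reader);
        JsonElement root = document.RootElement;
        return new Point
        {
            X = root.GetProperty("x").GetInt32(),
            Y = root.GetProperty("y").GetInt32(),
        };
    }

    // The IPersistableModel<T> members delegate to the JSON path through ModelReaderWriter.
    public BinaryData Write(ModelReaderWriterOptions options) => ModelReaderWriter.Write(this, options);

    public Point Create(BinaryData data, ModelReaderWriterOptions options) => ModelReaderWriter.Read<Point>(data, options)!;

    // "J" indicates that this model's preferred format is JSON.
    public string GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
}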
Troubleshooting
You can troubleshoot System.ClientModel-based clients by inspecting the result of any ClientResultException thrown from a pipeline's Send method.
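For example, a service call can be wrapped to inspect the failure details. A minimal sketch, again using a hypothetical client built on System.ClientModel:
try
{
    // Clients built on System.ClientModel throw ClientResultException when
    // the service returns an error response.
    ClientResult result = client.GetWidget("widget-id");
}
catch (ClientResultException ex)
{
    // Status is the HTTP status code reported by the service, when a response was received.
    Console.WriteLine($"Request failed with status {ex.Status}: {ex.Message}");

    // The raw HTTP response, when available, carries the headers and error body.
    PipelineResponse? response = ex.GetRawResponse();
    if (response is not null)
    {
        Console.WriteLine(response.Content.ToString());
    }
}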
Next steps
Contributing
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.
When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repositories using our CLA.
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.
Showing the top 20 packages that depend on System.ClientModel.
Package | Description | Downloads |
---|---|---|
Azure.AI.OpenAI | Azure's official .NET library for OpenAI inference that supports Completions, Chat, and Embeddings. Works with Azure OpenAI resources as well as the non-Azure OpenAI endpoint. | 4 |
Azure.AI.OpenAI | Azure's official .NET library for OpenAI inference that supports Completions, Chat, and Embeddings. Works with Azure OpenAI resources as well as the non-Azure OpenAI endpoint. | 5 |
Azure.AI.OpenAI | Azure's official .NET library for OpenAI inference that supports Completions, Chat, and Embeddings. Works with Azure OpenAI resources as well as the non-Azure OpenAI endpoint. | 6 |
Azure.Core | This is the implementation of the Azure Client Pipeline | 4 |
Azure.Core | This is the implementation of the Azure Client Pipeline | 5 |
Azure.Core | This is the implementation of the Azure Client Pipeline | 6 |
OpenAI | The official .NET library for the OpenAI service API. | 3 |
OpenAI | This is the OpenAI client library for developing .NET applications with rich experience. | 3 |
OpenAI | This is the OpenAI client library for developing .NET applications with rich experience. | 4 |
OpenAI | This is the OpenAI client library for developing .NET applications with rich experience. | 5 |
OpenAI | This is the OpenAI client library for developing .NET applications with rich experience. | 6 |
.NET 6.0
- System.Memory.Data (>= 1.0.2)
- System.Text.Json (>= 4.7.2)
.NET Standard 2.0
- System.Memory.Data (>= 1.0.2)
- System.Text.Json (>= 4.7.2)
Version | Downloads | Last updated |
---|---|---|
1.2.1 | 3 | 19.10.2024 |
1.2.0 | 0 | 03.10.2024 |
1.1.0 | 4 | 21.09.2024 |
1.1.0-beta.7 | 4 | 21.09.2024 |
1.1.0-beta.6 | 4 | 21.09.2024 |
1.1.0-beta.5 | 7 | 21.09.2024 |
1.1.0-beta.4 | 2 | 21.09.2024 |
1.1.0-beta.3 | 5 | 21.09.2024 |
1.1.0-beta.2 | 6 | 21.09.2024 |
1.1.0-beta.1 | 4 | 21.09.2024 |
1.0.0 | 6 | 03.06.2024 |
1.0.0-beta.4 | 5 | 21.09.2024 |
1.0.0-beta.3 | 5 | 21.09.2024 |
1.0.0-beta.2 | 4 | 21.09.2024 |
1.0.0-beta.1 | 2 | 21.09.2024 |