diff --git a/sdk/ai/Azure.AI.Inference/MIGRATION.md b/sdk/ai/Azure.AI.Inference/MIGRATION.md
new file mode 100644
index 000000000000..c428d7470f54
--- /dev/null
+++ b/sdk/ai/Azure.AI.Inference/MIGRATION.md
@@ -0,0 +1,97 @@
+## Migrate from `Azure.AI.Inference` to `OpenAI`
+
+Beginning with the `2025-05-01-preview` service API version, Model Inference has converged its capabilities with the OpenAI-compatible API surfaces available in Azure AI Foundry. As part of this convergence, the `Azure.AI.Inference` library is discontinued in favor of support via the official `OpenAI` library.
+
+This document describes how to migrate common model inference scenarios to their equivalent `OpenAI` operations.
+
+## Client configuration
+
+**Before, using `Azure.AI.Inference`:**
+
+```csharp
+using Azure.AI.Inference;
+
+Uri endpoint = new(Environment.GetEnvironmentVariable("AZURE_AI_CHAT_ENDPOINT"));
+AzureKeyCredential credential = new(Environment.GetEnvironmentVariable("AZURE_AI_CHAT_KEY"));
+
+// Chat
+ChatCompletionsClient client = new(endpoint, credential);
+
+// Embeddings
+EmbeddingsClient embeddingsClient = new(endpoint, credential, new AzureAIInferenceClientOptions());
+```
+
+**After, using `OpenAI`:**
+
+```csharp
+using OpenAI.Chat;
+
+Uri endpoint = new($"{Environment.GetEnvironmentVariable("AI_FOUNDRY_ENDPOINT")}/openai/v1");
+ApiKeyCredential credential = new(Environment.GetEnvironmentVariable("AI_FOUNDRY_API_KEY"));
+OpenAIClient openAIClient = new(
+    credential,
+    new OpenAIClientOptions()
+    {
+        Endpoint = endpoint,
+    });
+
+// Chat
+ChatClient client = openAIClient.GetChatClient(Environment.GetEnvironmentVariable("AI_FOUNDRY_MODEL_DEPLOYMENT"));
+
+// Embeddings
+EmbeddingClient embeddingClient = openAIClient.GetEmbeddingClient(Environment.GetEnvironmentVariable("AI_FOUNDRY_MODEL_DEPLOYMENT"));
+```
+
+## Use Chat Completions
+
+**Before, with `Azure.AI.Inference`:**
+
+```csharp
+Response<ChatCompletions> response = client.Complete(
+    new ChatCompletionsOptions()
+    {
+        Messages =
+        {
+            new ChatRequestSystemMessage("You are a helpful assistant."),
+            new ChatRequestUserMessage("How many feet are in a mile?"),
+        }
+    });
+Console.WriteLine(response.Value.Content);
+```
+
+**After, with `OpenAI`:**
+
+```csharp
+ChatCompletion completion = client.CompleteChat(
+    [
+        new SystemChatMessage("You are a helpful assistant."),
+        new UserChatMessage("How many feet are in a mile?"),
+    ]);
+Console.WriteLine(completion.Content[0].Text);
+```
+
+## Use Text Embeddings
+
+**Before, with `Azure.AI.Inference`:**
+
+```csharp
+EmbeddingsOptions options = new(
+    new List<string> { "King", "Queen", "Jack", "Page" });
+Response<EmbeddingsResult> response = embeddingsClient.Embed(options);
+foreach (EmbeddingItem item in response.Value.Data)
+{
+    List<float> embedding = item.Embedding.ToObjectFromJson<List<float>>();
+    Console.WriteLine($"Index: {item.Index}, Embedding: <{string.Join(", ", embedding)}>");
+}
+```
+
+**After, with `OpenAI`:**
+
+```csharp
+OpenAIEmbedding embedding = embeddingClient.GenerateEmbedding(["King", "Queen", "Jack", "Page"]);
+
+foreach (float embeddingValue in embedding.ToFloats())
+{
+    Console.WriteLine(embeddingValue);
+}
+```
\ No newline at end of file
diff --git a/sdk/ai/Azure.AI.Inference/README.md b/sdk/ai/Azure.AI.Inference/README.md
index f2ff479b219b..85e16d029fb0 100644
--- a/sdk/ai/Azure.AI.Inference/README.md
+++ b/sdk/ai/Azure.AI.Inference/README.md
@@ -1,5 +1,8 @@
 # Azure Inference client library for .NET
 
+> [!NOTE]
+> The `Azure.AI.Inference` library is now deprecated. Going forward, the official `OpenAI` library is the successor to this package. For more details, see [the migration guide](MIGRATION.md).
+
 The client library (in preview) does inference, including chat completions, for AI models deployed by [Azure AI Foundry](https://ai.azure.com) and [Azure Machine Learning Studio](https://ml.azure.com/). It supports Serverless API endpoints and Managed Compute endpoints (formerly known as Managed Online Endpoints).
 The client library makes service calls using REST API version `2024-05-01-preview`, as documented in [Azure AI Model Inference API](https://learn.microsoft.com/azure/ai-studio/reference/reference-model-inference-api). For more information, see [Overview: Deploy AI models in Azure AI Foundry portal](https://learn.microsoft.com/azure/ai-studio/concepts/deployments-overview).
 
 Use the model inference client library to:
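
---

The migration guide above covers synchronous chat completions and embeddings; streaming chat is another scenario that commonly needs migrating. The sketch below is illustrative only: it assumes the async streaming surfaces of each library (`CompleteStreamingAsync` on `ChatCompletionsClient` and `CompleteChatStreamingAsync` on `ChatClient`) and reuses the `client` variables configured in the guide.

```csharp
// Before, with Azure.AI.Inference (a sketch based on the library's streaming samples):
StreamingResponse<StreamingChatCompletionsUpdate> response =
    await client.CompleteStreamingAsync(new ChatCompletionsOptions()
    {
        Messages = { new ChatRequestUserMessage("How many feet are in a mile?") }
    });
await foreach (StreamingChatCompletionsUpdate update in response)
{
    // Each update carries an incremental fragment of the assistant's reply.
    Console.Write(update.ContentUpdate);
}

// After, with OpenAI (a sketch; each update holds zero or more content parts):
AsyncCollectionResult<StreamingChatCompletionUpdate> updates =
    client.CompleteChatStreamingAsync([new UserChatMessage("How many feet are in a mile?")]);
await foreach (StreamingChatCompletionUpdate update in updates)
{
    foreach (ChatMessageContentPart part in update.ContentUpdate)
    {
        Console.Write(part.Text);
    }
}
```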