
.Net: Bug: AzureOpenAIChatCompletionService.GetStreamingChatMessageContentsAsync() throwing exception with custom HttpMessageHandler #10197

Open
sergiocastelani opened this issue Jan 15, 2025 · 0 comments
Labels: bug (Something isn't working), .NET (Issue or Pull requests regarding .NET code)


sergiocastelani commented Jan 15, 2025

Describe the bug
After migrating from version 1.7.1 to 1.32.0, our code started throwing a

System.InvalidOperationException: Content stream position is not at beginning of stream

at the end of the GetStreamingChatMessageContentsAsync() loop.

For some reason, reading the HTTP response body ahead of time in a custom HttpMessageHandler makes the cleanup at the end of the streaming loop crash.
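A possible workaround (my own assumption, not verified against the SDK internals): hand the SDK a fresh copy of the body so that the content stream it later disposes still starts at position 0. Note that fully reading the body in the handler buffers the whole response and therefore defeats streaming, but that is already the case when the handler calls ReadAsStringAsync. The handler name below is hypothetical:

```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

internal class BufferingMessageHandler : DelegatingHandler
{
    public BufferingMessageHandler() { InnerHandler = new HttpClientHandler(); }

    protected override async Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        HttpResponseMessage response = await base.SendAsync(request, cancellationToken);

        // Reading the body consumes the live network stream; a later consumer
        // that expects Position == 0 would then fail. Buffer the bytes once.
        byte[] body = await response.Content.ReadAsByteArrayAsync(cancellationToken);
        Console.WriteLine(Encoding.UTF8.GetString(body));

        // Replace the consumed content with a fresh, seekable copy so the
        // stream seen downstream starts at the beginning.
        var buffered = new ByteArrayContent(body);
        foreach (var header in response.Content.Headers)
        {
            buffered.Headers.TryAddWithoutValidation(header.Key, header.Value);
        }
        response.Content = buffered;
        return response;
    }
}
```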

To Reproduce
I created a patch for the sample AzureOpenAI_ChatCompletionStreaming.cs (version 1.33.0-nightly-250110.1) to reproduce the problem:

diff --git a/dotnet/samples/Concepts/ChatCompletion/AzureOpenAI_ChatCompletionStreaming.cs b/dotnet/samples/Concepts/ChatCompletion/AzureOpenAI_ChatCompletionStreaming.cs
index 1ef364762..6b4fea7a7 100644
--- a/dotnet/samples/Concepts/ChatCompletion/AzureOpenAI_ChatCompletionStreaming.cs
+++ b/dotnet/samples/Concepts/ChatCompletion/AzureOpenAI_ChatCompletionStreaming.cs
@@ -7,6 +7,17 @@ using Microsoft.SemanticKernel.Connectors.OpenAI;
 
 namespace ChatCompletion;
 
+internal class MyMessageHandler : DelegatingHandler
+{
+    public MyMessageHandler() { InnerHandler = new HttpClientHandler(); }
+    protected override async Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)
+    {
+        HttpResponseMessage response = await base.SendAsync(request, cancellationToken);
+        Console.WriteLine(await response.Content.ReadAsStringAsync(cancellationToken));
+        return response;
+    }
+}
+
 /// <summary>
 /// These examples demonstrate the ways different content types are streamed by Azure OpenAI via the chat completion service.
 /// </summary>
@@ -44,7 +55,9 @@ public class AzureOpenAI_ChatCompletionStreaming(ITestOutputHelper output) : Bas
             deploymentName: TestConfiguration.AzureOpenAI.ChatDeploymentName,
             endpoint: TestConfiguration.AzureOpenAI.Endpoint,
             apiKey: TestConfiguration.AzureOpenAI.ApiKey,
-            modelId: TestConfiguration.AzureOpenAI.ChatModelId);
+            modelId: TestConfiguration.AzureOpenAI.ChatModelId,
+            httpClient: new HttpClient(new MyMessageHandler())
+         );
 
         // Create chat history with initial system and user messages
         ChatHistory chatHistory = new("You are a librarian, an expert on books.");

Output:

Message: 
System.InvalidOperationException : Content stream position is not at beginning of stream.

Stack Trace: 
HttpClientTransportResponse.BufferContentSyncOrAsync(CancellationToken cancellationToken, Boolean async)
TaskExtensions.EnsureCompleted[T](ValueTask`1 task)
HttpClientTransportResponse.BufferContent(CancellationToken cancellationToken)
HttpClientTransportResponse.Dispose(Boolean disposing)
HttpClientTransportResponse.Dispose()
AsyncStreamingChatUpdateEnumerator.DisposeAsyncCore()
AsyncStreamingChatUpdateEnumerator.DisposeAsync()
InternalAsyncStreamingChatCompletionUpdateCollection.GetValuesFromPageAsync(ClientResult page)+MoveNext()
Boolean>.GetResult()
AsyncCollectionResult`1.GetAsyncEnumerator(CancellationToken cancellationToken)+MoveNext()
AsyncCollectionResult`1.GetAsyncEnumerator(CancellationToken cancellationToken)+MoveNext()
AsyncCollectionResult`1.GetAsyncEnumerator(CancellationToken cancellationToken)+MoveNext()
Boolean>.GetResult()
ClientCore.GetStreamingChatMessageContentsAsync(String targetModel, ChatHistory chatHistory, PromptExecutionSettings executionSettings, Kernel kernel, CancellationToken cancellationToken)+MoveNext() line 282
ClientCore.GetStreamingChatMessageContentsAsync(String targetModel, ChatHistory chatHistory, PromptExecutionSettings executionSettings, Kernel kernel, CancellationToken cancellationToken)+MoveNext() line 353
Boolean>.GetResult()
AzureOpenAI_ChatCompletionStreaming.StreamServicePromptTextAsync() line 68
AzureOpenAI_ChatCompletionStreaming.StreamServicePromptTextAsync() line 68
--- End of stack trace from previous location ---

Expected behavior
The streaming loop completes without throwing an exception.

Platform

  • OS: Windows
  • IDE: Visual Studio
  • Language: C#
  • Source: version 1.33.0-nightly-250110.1, main branch of the repository

Additional context
The exception seems to come from the OpenAI-dotnet project, specifically from InternalAsyncStreamingChatCompletionUpdateCollection.cs.
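For illustration (my reading of the stack trace, not confirmed against the transport source): disposing the response appears to trigger buffering of any remaining content, guarded by a check that the stream is still at its beginning. Any read performed earlier by a handler advances the position and trips that guard:

```csharp
using System;
using System.IO;

var stream = new MemoryStream(new byte[] { 1, 2, 3 });
stream.ReadByte();                  // a handler peeking at the body
Console.WriteLine(stream.Position); // prints 1, no longer at the beginning

// A guard equivalent to the transport's (hypothetical reconstruction):
if (stream.Position != 0)
{
    throw new InvalidOperationException(
        "Content stream position is not at beginning of stream.");
}
```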

@sergiocastelani sergiocastelani added the bug Something isn't working label Jan 15, 2025
@markwallace-microsoft markwallace-microsoft added .NET Issue or Pull requests regarding .NET code triage labels Jan 15, 2025
@github-actions github-actions bot changed the title Bug: AzureOpenAIChatCompletionService.GetStreamingChatMessageContentsAsync() throwing exception with custom HttpMessageHandler .Net: Bug: AzureOpenAIChatCompletionService.GetStreamingChatMessageContentsAsync() throwing exception with custom HttpMessageHandler Jan 15, 2025