
Streaming doesn't work properly in Blazor WASM #65

Closed
MoienTajik opened this issue Jun 16, 2024 · 4 comments
Labels
bug Something isn't working

Comments

@MoienTajik

There is a problem with streaming using IAsyncEnumerable in Blazor WASM. When consuming the stream in a console app or any environment other than Blazor WASM, the items are processed one by one as they are fetched. However, in Blazor WASM, the stream behaves differently, and the response is only available after all items are fetched from the remote resource.

The issue affects both ChatClient.CompleteChatStreamingAsync and AssistantClient.CreateRunStreamingAsync methods (and potentially any other streaming methods).

Steps to reproduce:

  1. Set up a Blazor WASM project.
  2. Use the following code snippet to fetch streaming updates from a remote resource.
  3. Observe the behavior difference compared to a console app.

Code:

var messageContent = MessageContent.FromText(prompt);

var streamingUpdates = assistantClient.CreateRunStreamingAsync(
    thread.Id,
    assistant.Id,
    new()
    {
        AdditionalMessages = { new([messageContent]) }
    }
);

await foreach (var streamingUpdate in streamingUpdates)
{
    switch (streamingUpdate.UpdateKind)
    {
        case StreamingUpdateReason.MessageCreated:
            Console.WriteLine($"Message created: {DateTimeOffset.Now:O}");
            break;

        case StreamingUpdateReason.MessageUpdated when streamingUpdate is MessageContentUpdate messageContent:
            Console.WriteLine($"Message updated: {messageContent.Text} -- {DateTimeOffset.Now:O}");
            break;
    }
}

Expected Behavior:

Each item should be processed as it is fetched from the remote resource, similar to the behavior observed in a console app.

Actual Behavior:

In Blazor WASM, the stream processes all items only after they are completely fetched, rather than one by one.

Environment:

  • Blazor WASM on .NET 8
  • OpenAI NuGet package version: 2.0.0-beta.5

Additional Information:

The problem is detailed in this blog post. To ensure consistent streaming behavior between Blazor WASM and other environments, the request needs to call SetBrowserResponseStreamingEnabled(true) on the HttpRequestMessage and pass HttpCompletionOption.ResponseHeadersRead when sending it, so the response body is exposed as it arrives instead of being buffered.
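For reference, the two settings can be sketched on a raw HttpRequestMessage as follows. This is a minimal illustration, not the OpenAI library's own code; the URL is a placeholder, and SetBrowserResponseStreamingEnabled comes from the Blazor WASM SDK (Microsoft.AspNetCore.Components.WebAssembly.Http), so it only compiles and takes effect inside a Blazor WASM project.

```csharp
using System.Net.Http;
using Microsoft.AspNetCore.Components.WebAssembly.Http;

// Placeholder endpoint for illustration only.
var request = new HttpRequestMessage(HttpMethod.Get, "https://example.com/stream");

// Opt in to fetch-based response streaming; no-op outside the browser.
request.SetBrowserResponseStreamingEnabled(true);

using var client = new HttpClient();

// Return as soon as headers arrive rather than buffering the whole body.
using var response = await client.SendAsync(
    request,
    HttpCompletionOption.ResponseHeadersRead);

await using var stream = await response.Content.ReadAsStreamAsync();
// Read incrementally from `stream` as chunks are delivered.
```

Without both settings, the browser's HttpClient buffers the entire response before handing it to the caller, which is why the items appear all at once.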

@trrwilson trrwilson added the bug Something isn't working label Jun 17, 2024
@KrzysztofCwalina
Collaborator

@annelo-msft, we need to allow setting/calling HttpRequestMessage.SetBrowserResponseStreamingEnabled(true) on HttpClient's messages to fix this problem.

@annelo-msft
Contributor

Tracking with Azure/azure-sdk-for-net#44706

@annelo-msft
Contributor

@KrzysztofCwalina, I believe we may already have the method we need on HttpClientPipelineTransport in the OnSendingRequest method.

For example, following the repro steps above, I am able to create a sample project and add a sample transport implementation that overrides OnSendingRequest:

public class BlazorHttpClientTransport : HttpClientPipelineTransport
{
    protected override void OnSendingRequest(PipelineMessage message, HttpRequestMessage httpRequest)
    {
        // Opt in to fetch-based response streaming; this only takes
        // effect when the HttpClient runs in the browser (Blazor WASM).
        httpRequest.SetBrowserResponseStreamingEnabled(true);
        base.OnSendingRequest(message, httpRequest);
    }
}

The default SCM transport already passes HttpCompletionOption.ResponseHeadersRead to HttpClient.Send, so I believe that requirement should be addressed without any additional customization to the transport implementation.

The transport can then be applied to the client by passing an instance of OpenAIClientOptions as follows:

OpenAIClientOptions options = new();
options.Transport = new BlazorHttpClientTransport();
ChatClient client = new(model: "gpt-4o", apiKey, options);

If this doesn't address the issue as you were thinking @KrzysztofCwalina, let me know.

Or @MoienTajik, if this doesn't address the problem you're seeing, I'm happy to dig further into needs here.

Thanks!

@MoienTajik
Author

Thanks, Anne, for investigating this! I can confirm that this works and solves the problem.
