Streaming Parallel Chat Completions #2084
Unanswered
mosnicholas asked this question in Help
Replies: 2 comments 1 reply
-
cc: @lgrammel @MaxLeiter curious if this is possible? :)
-
Yes! I wrote an article on how to do parallel streams: https://mikecavaliere.com/posts/multiple-parallel-streams-vercel-ai-sdk

Here's an example I'm running locally, where I'm running two models in parallel with the same prompt:

```tsx
// src/app/action.tsx
"use server";

import { createStreamableValue } from "ai/rsc";
import { CoreMessage, streamText } from "ai";
import { anthropic } from "@ai-sdk/anthropic";
import { openai } from "@ai-sdk/openai";

export async function continueConversation(messages: CoreMessage[]) {
  const anthropicResult = await streamText({
    model: anthropic("claude-3-5-sonnet-20240620"),
    messages,
  });
  const openaiResult = await streamText({
    model: openai("gpt-4o"),
    messages,
  });

  // Accumulated values; each one gets appended to over time as chunks arrive.
  const anthropicStream = createStreamableValue(anthropicResult.textStream);
  const openaiStream = createStreamableValue(openaiResult.textStream);

  return {
    anthropicStream: anthropicStream.value,
    openaiStream: openaiStream.value,
  };
}
```

Then in the client I have something like this:

```tsx
<form
  onSubmit={async (e) => {
    e.preventDefault();

    // Add the typed message to state
    const newMessages: CoreMessage[] = [
      ...messages,
      { content: input, role: "user" },
    ];
    setMessages(newMessages);
    setInput("");

    // Send the current chat history to the server & LLMs
    const result = await continueConversation(newMessages);

    // Kick off both readers without awaiting them, so they stream concurrently
    (async function () {
      // Stream each returned chunk into the Anthropic message list
      for await (const content of readStreamableValue(result.anthropicStream)) {
        setMessagesAnthropic([
          ...newMessages,
          { role: "assistant", content: content as string },
        ]);
      }
    })();

    (async function () {
      for await (const content of readStreamableValue(result.openaiStream)) {
        setMessagesOpenai([
          ...newMessages,
          { role: "assistant", content: content as string },
        ]);
      }
    })();
  }}
>
  <input
    className="fixed bottom-0 w-full max-w-md p-2 mb-8 border border-gray-300 rounded shadow-xl"
    value={input}
    placeholder="Say something..."
    onChange={(e) => setInput(e.target.value)}
  />
</form>
```
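The reason both responses appear at once is that the two IIFEs above are started without being awaited, so each `for await` loop drains its stream independently. Here's a self-contained sketch of that pattern using mock async iterables in place of real model streams (the `delayedChunks` helper and the chunk contents are invented for illustration, not part of the ai SDK):

```typescript
// Mock stand-in for a model's textStream: yields chunks with a delay.
async function* delayedChunks(chunks: string[], delayMs: number) {
  for (const chunk of chunks) {
    await new Promise((resolve) => setTimeout(resolve, delayMs));
    yield chunk;
  }
}

async function main(): Promise<string[]> {
  const order: string[] = [];

  const fastStream = delayedChunks(["Hello", " from", " fast"], 20);
  const slowStream = delayedChunks(["Hi", " from", " slow"], 45);

  // Start both readers without awaiting either one individually...
  const readers = [
    (async () => {
      for await (const c of fastStream) order.push(`fast:${c}`);
    })(),
    (async () => {
      for await (const c of slowStream) order.push(`slow:${c}`);
    })(),
  ];

  // ...then wait for both to finish. The chunks end up interleaved
  // in arrival order, just like the two streams in the component above.
  await Promise.all(readers);
  return order;
}

main().then((order) => console.log(order.join(" | ")));
```

If you awaited the first loop before starting the second, the slow stream would not begin rendering until the fast one had fully finished.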
-
OpenAI has developed a really cool mechanism to gather feedback from users: stream multiple completions in parallel, and ask users to pick the better one.

Is there a way to accomplish this with the `ai` package, e.g. stream two different OpenAI model responses, or OpenAI vs. Anthropic, and get feedback on which is better for our users? How could we accomplish it?

Reference: (screenshot of the side-by-side completion picker; original image link has expired)
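For the "pick the better one" half of this, one minimal approach is to record which finished response the user clicks once both streams complete. A sketch, where the model labels and the in-memory `preferences` store are hypothetical (not part of the ai SDK):

```typescript
// Illustrative preference-feedback store; in practice this would
// write to a database rather than an in-memory array.
type Preference = {
  prompt: string;
  winner: "model-a" | "model-b";
  responses: { "model-a": string; "model-b": string };
};

const preferences: Preference[] = [];

// Called when the user clicks one of the two finished responses.
function recordPreference(
  prompt: string,
  responses: Preference["responses"],
  winner: Preference["winner"]
): Preference {
  const entry: Preference = { prompt, winner, responses };
  preferences.push(entry);
  return entry;
}

// Simple tally so you can see which model users prefer overall.
function winRate(model: Preference["winner"]): number {
  if (preferences.length === 0) return 0;
  const wins = preferences.filter((p) => p.winner === model).length;
  return wins / preferences.length;
}
```

Anonymizing which model produced which response (e.g. randomizing left/right placement) helps keep the comparison unbiased.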