Feeding tool result back into messages AND streaming the final summary #1723
Thanks to #1574, I am able to feed tool results back into messages to output a summary. I now want to be able to stream the result back as a response (I am not using `useChat`). This is the code I tried:

```ts
import { streamText, tool, type CoreMessage } from "ai";
import { z } from "zod";

const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

async function queryLlm(): Promise<ReadableStream> {
  // ...
  let messages: CoreMessage[] = [
    { role: "user", content: "Hi!" },
    { role: "assistant", content: "Hello, how can I help?" },
    {
      role: "user",
      content: "What's the weather in New York?",
    },
  ];
  let textResultIsPresent = false;
  let textStreamResult: ReadableStream<string> | undefined;
  while (!textResultIsPresent) {
    await sleep(2000);
    const { textStream, toolCalls, toolResults } = await streamText({
      model,
      tools: {
        weather: tool({
          description: "Get the weather in a location",
          parameters: z.object({
            location: z
              .string()
              .describe("The location to get the weather for"),
          }),
          execute: async ({ location }) => ({
            location,
            temperature: 72 + Math.floor(Math.random() * 21) - 10,
          }),
        }),
      },
      messages,
    });
    if (toolResults.length > 0 && toolCalls.length > 0) {
      messages.push({
        role: "assistant" as const,
        content: toolCalls,
      });
      messages.push({
        role: "tool" as const,
        content: toolResults,
      });
    } else {
      textStreamResult = textStream;
      textResultIsPresent = true;
    }
    console.log(JSON.stringify(messages, null, 2));
  }
  return textStreamResult!;
}
```

But the response is never returned, presumably because … Is this a correct way to do tool calling + streaming (without using `useChat`)?
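For reference, a `ReadableStream<string>` like the one `queryLlm()` resolves to can be drained with a standard Web Streams reader. A minimal sketch, using a hard-coded stream in place of the SDK's `textStream` (the stream contents here are an assumption for illustration; Node 18+ provides `ReadableStream` globally):

```typescript
// Drain a ReadableStream<string> into a single string using the
// standard Web Streams reader API.
async function readAll(stream: ReadableStream<string>): Promise<string> {
  const reader = stream.getReader();
  let out = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    out += value;
  }
  return out;
}

// Stand-in for the stream queryLlm() would return (hard-coded chunks).
const demo = new ReadableStream<string>({
  start(controller) {
    controller.enqueue("It is ");
    controller.enqueue("72°F in New York.");
    controller.close();
  },
});

readAll(demo).then((text) => console.log(text)); // prints "It is 72°F in New York."
```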
Replies: 3 comments 3 replies

---

The statement is stuck in … *(screenshot not captured)*

---
You can enable tool call roundtrips with:
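(The snippet itself is missing from the capture.) The pattern a roundtrip automates is: call the model again with the tool call and its result appended to the history, until the model replies with plain text. A self-contained sketch with a mocked model — the mock, its return shape, and the message type are assumptions for illustration, not the SDK's API:

```typescript
// Illustration of what a tool-call roundtrip automates: keep calling the
// model, feeding tool results back into the message list, until it
// answers with plain text.
type Message = { role: "user" | "assistant" | "tool"; content: string };

// Mock model: requests the weather tool once, then answers in text.
function mockModel(messages: Message[]): { toolCall?: string; text?: string } {
  const hasToolResult = messages.some((m) => m.role === "tool");
  return hasToolResult
    ? { text: "It is 72°F in New York." }
    : { toolCall: "weather(New York)" };
}

function runRoundtrips(messages: Message[], maxRoundtrips: number): string {
  for (let i = 0; i < maxRoundtrips; i++) {
    const step = mockModel(messages);
    if (step.text !== undefined) return step.text; // final text answer
    // Record the tool call and feed its result back into the history.
    messages.push({ role: "assistant", content: step.toolCall! });
    messages.push({ role: "tool", content: "72" });
  }
  throw new Error("no final answer within roundtrip limit");
}

console.log(runRoundtrips([{ role: "user", content: "Weather in NY?" }], 3));
// prints "It is 72°F in New York."
```

With roundtrips enabled, the SDK would drive this loop itself, replacing the manual `while` loop in the question.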

---
@zineanteoh Were you able to find a workaround for this while we await streamText support?