-
I am trying to set up the Vercel AI SDK with an Express endpoint, which I will then use with Remix. After all my custom logic I have a […]. This just has a […].
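One recurring snag with Express here is that the AI SDK helpers hand back a web `ReadableStream`, while an Express `res` is a Node writable stream. A minimal sketch of the glue (the `fakeAiSdkStream` helper is a hypothetical stand-in for the stream the SDK would return; the real stream would come from the SDK itself):

```typescript
import { Readable } from "node:stream";

// Hypothetical stand-in for the web ReadableStream the AI SDK returns;
// in real code this would come from the SDK, not be hand-built.
function fakeAiSdkStream(): ReadableStream<Uint8Array> {
  const encoder = new TextEncoder();
  return new ReadableStream({
    start(controller) {
      controller.enqueue(encoder.encode("streamed "));
      controller.enqueue(encoder.encode("tokens"));
      controller.close();
    },
  });
}

// A web ReadableStream is not a Node stream, so convert it with
// Readable.fromWeb before piping into an Express response:
//   Readable.fromWeb(stream).pipe(res)
function toNodeStream(stream: ReadableStream<Uint8Array>): Readable {
  return Readable.fromWeb(stream as any);
}

// Demo: collect the piped bytes instead of writing to an HTTP response.
async function demo(): Promise<string> {
  const chunks: Buffer[] = [];
  for await (const chunk of toNodeStream(fakeAiSdkStream())) {
    chunks.push(Buffer.from(chunk));
  }
  return Buffer.concat(chunks).toString("utf8");
}

demo().then((text) => console.log(text)); // logs "streamed tokens"
```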
-
You can use […].
-
Not sure if this helps, but the AI SDK uses the Node standard fetch. Have you tried setting the following in […]?
More info: https://remix.run/docs/en/main/guides/single-fetch
I just figured this out. Leaving the idea here for anyone else stumbling on this discussion looking for answers:

1. Copy-paste the file from this comment: #199 (comment)
2. Take the response from `openai.createChatCompletion()` and re-create it as […].
3. Then use `new StreamingTextResponse(OpenAIStream(resp))` to create the stream that can be returned from a remix-run action.
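To make the shape of that last step concrete, here is a self-contained sketch using simplified stand-ins for the AI SDK helpers (`OpenAIStream`, `StreamingTextResponse`) and a fake token generator in place of `openai.createChatCompletion()`; in a real Remix action you would import the actual helpers from the `ai` package rather than define them yourself.

```typescript
// Simplified stand-ins, for illustration only -- real code imports
// OpenAIStream and StreamingTextResponse from the "ai" package.

// Roughly the role OpenAIStream plays after parsing the SSE response:
// turn a sequence of text deltas into a web ReadableStream of bytes.
function toTextStream(chunks: AsyncIterable<string>): ReadableStream<Uint8Array> {
  const encoder = new TextEncoder();
  return new ReadableStream({
    async start(controller) {
      for await (const chunk of chunks) {
        controller.enqueue(encoder.encode(chunk));
      }
      controller.close();
    },
  });
}

// StreamingTextResponse is essentially a Response whose body is that
// stream, which a Remix action can return directly.
function streamingTextResponse(stream: ReadableStream<Uint8Array>): Response {
  return new Response(stream, {
    headers: { "Content-Type": "text/plain; charset=utf-8" },
  });
}

// Fake stand-in for the token deltas from openai.createChatCompletion().
async function* fakeCompletion(): AsyncGenerator<string> {
  yield "Hello, ";
  yield "world!";
}

// Mirrors `new StreamingTextResponse(OpenAIStream(resp))`:
const response = streamingTextResponse(toTextStream(fakeCompletion()));
response.text().then((body) => console.log(body)); // logs "Hello, world!"
```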