
Replace node-fetch with undici #392

Ethan-Arrowood opened this issue Oct 20, 2023 · 8 comments · Fixed by gptscript-ai/gpt4-v-vision#4 · May be fixed by #402

Comments

@Ethan-Arrowood

Confirm this is a feature request for the Node library and not the underlying OpenAI API.

  • This is a feature request for the Node library

Describe the feature or improvement you're requesting

I noticed this library is still using node-fetch because Node's native fetch is considered experimental.

I think it'd be in the library's best interest to switch to undici instead. Undici is the fetch implementation in Node.js; for all intents and purposes it is stable (nodejs/undici#1737). We (the maintainers of undici) have some concerns about marking it as such in Node.js just yet because of the nature of the Fetch API spec: the spec itself occasionally adds breaking changes, which doesn't fit well with Node.js's versioning strategy. It's complicated; read the issue I linked above for more details.

Switching to undici for the shim will enable a significantly easier upgrade path in the future, whenever we figure out how to mark it as properly stable in Node.js.

Happy to help swap this out too if the maintainers approve 😄 🚀

Additional context

No response

@rattrayalex
Collaborator

Hi @Ethan-Arrowood,

Thanks for raising this, it's something we'd be open to – and I appreciate the offer of help.

There are a few related packages in the node-fetch ecosystem that we rely on as well; could you help advise on replacements?

They are:

  1. agentkeepalive for reusing connections
  2. formdata-node and form-data-encoder for multipart file uploads
  3. abort-controller (I assume this will Just Work with undici?)

@Ethan-Arrowood
Author

  1. Undici's agent is keep-alive by default 😄 and completely customizable too.
  2. We have our own spec-compliant FormData implementation directly in undici
  3. Native AbortController just works 😄
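The points above are easy to verify on a modern Node runtime. A minimal sketch (Node 18+, no third-party packages; the field names below are illustrative):

```javascript
// Node >= 18 exposes undici's spec-compliant FormData and Blob as globals,
// and AbortController has been global since Node 15, so no extra deps are needed.
const form = new FormData();
form.append('purpose', 'fine-tune');
// Appending a Blob with a filename yields a spec-compliant File entry.
form.append('file', new Blob(['{"prompt":"hi"}'], { type: 'application/jsonl' }), 'data.jsonl');

const controller = new AbortController();
controller.abort(); // a fetch(url, { signal: controller.signal }) would now reject
```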

@rattrayalex
Collaborator

rattrayalex commented Oct 20, 2023

Great!

  1. Do you know of docs or a guide on how to configure your keep-alive agent to disable Nagle's algorithm and get the other optimizations from agentkeepalive? I'm sure we can figure this out ourselves; it would just save some eng time.
  2. That's great to hear. Do you simply export FormData classes and similar? I'm not immediately finding that in the docs. What about Blob and File classes? (Right now we have to use a third-party dep for those as well.)
  3. Great, looks like native AbortController is supported in all non-EOL Node versions now (and has been for a while - I was just behind the times).

If you'd be willing to take a crack at porting this, it'd certainly be tremendously appreciated. I think the toughest part will be our file-uploads machinery (especially mucking with cross-environment support). We have pretty extensive tests (look for ecosystem-tests). You're welcome to ping me by email with any questions.

If that proves to be too big an ask, we can add this to our roadmap; it does sound like a good thing to do.

@Ethan-Arrowood
Author

Yeah, apologies that our docs aren't ideal. There is a ton of extensibility, though.

  1. https://undici.nodejs.org/#/docs/api/Agent documents our Agent class. Combine it with the setGlobalDispatcher method (https://undici.nodejs.org/#/?id=undicisetglobaldispatcherdispatcher) and you can customize the Agent that all fetch calls use by default. I can't remember exactly how this plays out if a downstream user overwrites it with their own call to setGlobalDispatcher, so the custom agent can also be passed to the fetch call directly via the dispatcher option (https://github.com/nodejs/undici/blob/63afd9b5e28380dc86de29ade69adaad7efcd231/types/fetch.d.ts#L117).
  2. Yes we do! All of those exist. The docs don't make that obvious (wow, I really gotta fix that 😅), but here is the code path for File and FormData: https://github.com/nodejs/undici/blob/main/index.js#L119-L120, and Blob has been non-experimental in Node since v18: https://nodejs.org/api/buffer.html#class-blob
  3. 🎉
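A sketch of what wiring up a shared Agent could look like. The option names below are real undici Agent options, but the values are illustrative, and the undici calls are commented out so the snippet doesn't require the npm package to be installed:

```javascript
// Options we'd hand to undici's Agent; these replace agentkeepalive's role.
const agentOptions = {
  connections: 128,             // max sockets per origin (pooled, keep-alive reuse)
  keepAliveTimeout: 4_000,      // how long an idle socket stays open, in ms
  keepAliveMaxTimeout: 600_000, // upper bound a server's keep-alive hint can extend to
};
// const { Agent, setGlobalDispatcher } = require('undici');
// const agent = new Agent(agentOptions);
// setGlobalDispatcher(agent);              // default dispatcher for every fetch()
// await fetch(url, { dispatcher: agent }); // or opt in per call
```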

Happy to do so! Let me take a swing at things next week.

I'll also ask: are you strictly tied to using fetch for some interim reason? Or is the Node.js part of this distinct enough that you could use a better networking client? Undici itself has much better APIs, such as .request, .stream, and .pipeline, that deliver significantly faster throughput than fetch. The library is an official Node.js project and is the recommended HTTP client for new Node.js projects. Maintainers include Node.js TSC members and long-time contributors.
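To give a feel for the .request style, here is a sketch of the consumption pattern. The stub below only mimics the async-iterable body that undici's request() resolves with; fakeBody and readJson are illustrative names, not undici APIs, and the real call is shown in comments:

```javascript
// undici's request() resolves to { statusCode, headers, body } where body is an
// async-iterable stream. This stub mimics that shape so the pattern runs
// without network access.
async function* fakeBody() {
  yield '{"object":';
  yield '"list"}';
}

// Drain an async-iterable body and parse it as JSON.
async function readJson(body) {
  let raw = '';
  for await (const chunk of body) raw += chunk;
  return JSON.parse(raw);
}

// With real undici it would look like:
// const { request } = require('undici');
// const { statusCode, body } = await request('https://api.openai.com/v1/models');
// const data = await readJson(body);
```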

@rattrayalex
Collaborator

Interesting, I didn't know that undici offered a different, faster request interface as well.

There are a few considerations I'd have:

  1. We currently use native fetch on platforms where it's available, partly because node-fetch wouldn't work in non-node envs but partly because some runtimes may desire/expect/require native fetch to be used (for example, I believe Vercel Edge Runtime manipulates outgoing requests). I don't feel strongly about doing this but if we were to move away from native fetch by default on these platforms, it would have to be a very well-researched decision with few to no tradeoffs.
  2. We accept a user-provided fetch function in client instantiation, and we'd want to continue to do so.

I'd be a little worried about bifurcating our request pathways between "env-or-user-provided fetch" and "built-in undici .request et al.", but if it's what you'd recommend after considering the above and exploring the codebase, I'm certainly open to it!
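The user-provided-fetch pattern mentioned in point 2 can be sketched like this; MiniClient is a hypothetical name for illustration, not the actual openai-node client:

```javascript
// A client that accepts an injected fetch implementation and falls back to the
// environment's native fetch (global since Node 18).
class MiniClient {
  constructor({ fetch: fetchFn } = {}) {
    this.fetch = fetchFn ?? globalThis.fetch?.bind(globalThis);
    if (!this.fetch) throw new Error('No fetch implementation available');
  }
  get(url) {
    return this.fetch(url, { method: 'GET' });
  }
}
```

A caller can then pass undici's fetch, an edge runtime's patched fetch, or a test stub, and the client's request path stays uniform either way.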

@Ethan-Arrowood
Author

Let's start with switching to undici fetch. I'd love to dig in deeper in the future; I think it's an interesting problem. This is exactly what fetch in Node exists for: interoperability. I think you have an amazing example of a cross-environment JavaScript SDK 😁

@rattrayalex
Collaborator

Thanks Ethan! Very excited to see what you come up with!

Ethan-Arrowood linked a pull request Oct 24, 2023 that will close this issue
@shoyuf

shoyuf commented Oct 25, 2023

Undici fetch does look better! 🎉

My reverse proxy for api.openai.com has response buffering enabled by default, which takes some time to buffer data over HTTP/1.1. This is not ideal for streaming completion output.

I spent a lot of time trying to figure out this problem.

I don't know how to make node-fetch support HTTP/2, so switching to undici is a better idea.
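For reference, undici can negotiate HTTP/2 when allowH2 is enabled on its Agent (node-fetch has no HTTP/2 support). A sketch, with the undici constructor commented out so the snippet carries no package dependency:

```javascript
// allowH2 is a real undici Agent/Client option; the values here are illustrative.
const h2Options = {
  allowH2: true,   // permit ALPN negotiation of HTTP/2
  connections: 16, // socket pool size per origin
};
// const { Agent } = require('undici');
// const agent = new Agent(h2Options);
// await fetch('https://api.openai.com/v1/models', { dispatcher: agent });
```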

njhale added a commit to gptscript-ai/gpt4-v-vision that referenced this issue Mar 4, 2024
The openai package transitively depends on an outdated version of
whatwg-url, which causes nodejs to print a deprecation warning for
punycode when running the vision tool. This will be fixed when openai
resolves openai/openai-node#392. Until then,
override the version of whatwg-url in order to prevent the deprecation
warning.

See openai/openai-node#527 (comment)
for details on this stopgap.

Signed-off-by: Nick Hale <[email protected]>