feat: dispatch #300
Conversation
README.md (outdated)
undici - request x 11,949 ops/sec ±0.99% (85 runs sampled)
undici - stream x 12,223 ops/sec ±0.76% (85 runs sampled)
http - keepalive x 5,392 ops/sec ±6.66% (68 runs sampled)
undici - pipeline x 7,055 ops/sec ±3.54% (79 runs sampled)
The pipeline performance regression is from a previous commit. I haven't bisected yet.
The new API looks good IMO. Instead of returning
Just an idea.
That's kind of nice. Less magical.
Another alternative is something like: `_onConnect(controller)`, where `controller` has the following methods:
Which would resolve the async resource issue as well... at the cost of an extra object allocation.
EDIT: `runInAsyncScope` won't work, since we would need it before `_onConnect`.
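A rough sketch of what the controller alternative could look like. The method names (`pause`, `resume`, `abort`) and the handler shape below are purely hypothetical assumptions for illustration, since the comment's original list of methods is not shown:

```javascript
// Purely hypothetical sketch of the controller alternative discussed
// above; the real method set was not specified in this comment.
class Controller {
  constructor () {
    this.paused = false
    this.aborted = false
  }

  pause () { this.paused = true }
  resume () { this.paused = false }
  abort () { this.aborted = true }
}

// The handler would receive a single object instead of a bare
// resume function, at the cost of one extra allocation per request.
const handler = {
  _onConnect (controller) {
    this.controller = controller
  }
}

const c = new Controller()
handler._onConnect(c)
handler.controller.pause()
console.log(handler.controller.paused) // true
```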
@mcollina I think the perf diff is within the margin of error now...
master:
undici - pipeline x 7,225 ops/sec ±1.49% (273 runs sampled)
undici - request x 11,760 ops/sec ±0.90% (277 runs sampled)
undici - stream x 12,568 ops/sec ±0.79% (280 runs sampled)
undici - simple x 12,949 ops/sec ±0.59% (278 runs sampled)
undici - noop x 19,087 ops/sec ±1.19% (259 runs sampled)

dispatch2:
undici - pipeline x 7,281 ops/sec ±1.68% (273 runs sampled)
undici - request x 11,700 ops/sec ±0.56% (278 runs sampled)
undici - stream x 12,684 ops/sec ±0.72% (280 runs sampled)
undici - dispatch x 13,446 ops/sec ±0.37% (276 runs sampled)
undici - noop x 19,590 ops/sec ±1.06% (255 runs sampled)
lgtm
Makes our internal API public. This is the API that our public API is implemented on top of.
There is a need for some internal cleanup and refactoring after this PR. I've tried to keep the changes as small as possible for ease of review.
I am not 100% satisfied with the API. However, I believe it fulfills the functional and performance requirements. I would appreciate some feedback and maybe ideas. I suspect this API might evolve/improve over time and be less stable than the other ones.
Some problems that might or might not need to be resolved:
- There are now two `AsyncResource`s, the `handler` object and the `Request` object, depending on whether the implementation needs to asynchronously propagate async scope.
- It's a bit weird that `_onConnect` takes a `resume` function.
- Not sure about the naming of `_onConnect`.
- It's a bit weird that `_onData` should return `null` to indicate an aborted message.

Refs: #293
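To make the handler contract above concrete, here is a minimal self-contained sketch. The `CollectHandler` class and the fake `dispatch` function are illustrative assumptions (not undici's actual internals); it only demonstrates the shape discussed in this PR: `_onConnect` receiving a `resume` function and `_onData` returning `null` to signal an aborted message.

```javascript
// Hypothetical sketch of the dispatch-style handler contract
// discussed in this PR; names and call order are assumptions,
// not undici's exact internal API.

class CollectHandler {
  constructor () {
    this.chunks = []
    this.aborted = false
  }

  _onConnect (resume) {
    // The handler is handed a resume function up front (one of the
    // points of contention above).
    this.resume = resume
  }

  _onHeaders (statusCode, headers) {
    this.statusCode = statusCode
  }

  _onData (chunk) {
    this.chunks.push(chunk)
    // Returning null signals an aborted message in this sketch.
    return this.aborted ? null : true
  }

  _onComplete () {
    this.body = this.chunks.join('')
  }
}

// Minimal fake dispatcher to exercise the contract.
function dispatch (handler) {
  handler._onConnect(() => {})
  handler._onHeaders(200, {})
  if (handler._onData('hello ') === null) return
  if (handler._onData('world') === null) return
  handler._onComplete()
}

const h = new CollectHandler()
dispatch(h)
console.log(h.statusCode, h.body) // 200 hello world
```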