
Use llhttp wasm build #575

Closed
wants to merge 10 commits

Conversation

@dnlup dnlup commented Feb 27, 2021

In this PR I'll try to:

  • set up a branch at https://github.com/dnlup/llhttp/tree/undici_wasm which defines some npm build scripts to make a wasm build
  • set up a script to run the build in this repo and save the build output
  • implement a minimal parser that is compatible for use as a base class for the undici parser (I am still undecided whether it should live in llhttp or here)
  • test that the parser is a drop-in replacement (hopefully) and that performance degradation is minimal

This is still messy, I'll get there eventually.

Final tasks:

  • implement lenient option
  • implement headersTimeout option
  • implement maxHeadersSize option

Fixes #22

@ronag left a comment

You only need to implement RESPONSE for client.

@mcollina

The goal for this activity should be to verify that we suffer minimal slowdowns from using wasm instead of native.


ronag commented Feb 28, 2021

We also probably want to reduce the number of calls across the js <-> wasm border so we might want to modify the c level code as well. Let's just get it working and passing tests first and then we optimize.

@mcollina

I agree!


codecov-io commented Feb 28, 2021

Codecov Report

Merging #575 (66af633) into master (3d002c1) will decrease coverage by 1.36%.
The diff coverage is 92.71%.


@@            Coverage Diff             @@
##           master     #575      +/-   ##
==========================================
- Coverage   99.36%   97.99%   -1.37%     
==========================================
  Files          16       17       +1     
  Lines        1414     1547     +133     
==========================================
+ Hits         1405     1516     +111     
- Misses          9       31      +22     
Impacted Files Coverage Δ
lib/llhttp/parser.js 92.36% <92.36%> (ø)
lib/core/client.js 98.76% <100.00%> (-0.38%) ⬇️
lib/llhttp/constants.js 100.00% <100.00%> (ø)
lib/core/util.js 88.15% <0.00%> (-7.90%) ⬇️
lib/core/request.js 96.93% <0.00%> (-3.07%) ⬇️
lib/node/http-parser.js

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 3d002c1...66af633.


dnlup commented Feb 28, 2021

If you have time could you try npm run build on mac? I only have a Linux box.


dnlup commented Feb 28, 2021

The llhttp repo is cloned using ssh.


dnlup commented Feb 28, 2021

Rebased against #571. I'll fix tests iteratively next.

@ronag ronag mentioned this pull request Feb 28, 2021

dnlup commented Mar 2, 2021

I was able to run a benchmark with the current changes, even though they are not complete:

Node version: 14.16.0

current:

> [email protected] bench /home/dnlup/Dev/dnlup/undici
> npx concurrently -k -s first "node benchmarks/server.js" "node -e 'setTimeout(() => {}, 1000)' && node benchmarks"

[1] http - no agent  x 402 ops/sec ±3.49% (61 runs sampled)
[1] http - keepalive x 453 ops/sec ±2.94% (59 runs sampled)
[1] http - keepalive - multiple sockets x 3,191 ops/sec ±4.19% (72 runs sampled)
[1] undici - pipeline x 3,702 ops/sec ±3.84% (69 runs sampled)
[1] undici - request x 3,572 ops/sec ±3.56% (66 runs sampled)
[1] undici - pool - request - multiple sockets x 3,527 ops/sec ±3.79% (65 runs sampled)
[1] undici - stream x 3,885 ops/sec ±3.37% (71 runs sampled)
[1] undici - dispatch x 3,842 ops/sec ±4.04% (65 runs sampled)
[1] undici - noop x 4,524 ops/sec ±1.35% (67 runs sampled)

llhttp_wasm:

> [email protected] bench /home/dnlup/Dev/dnlup/undici
> npx concurrently -k -s first "node benchmarks/server.js" "node -e 'setTimeout(() => {}, 1000)' && node benchmarks"

[1] http - no agent  x 420 ops/sec ±4.12% (50 runs sampled)
[1] http - keepalive x 441 ops/sec ±2.88% (52 runs sampled)
[1] http - keepalive - multiple sockets x 3,400 ops/sec ±2.81% (78 runs sampled)
[1] undici - pipeline x 3,813 ops/sec ±2.96% (72 runs sampled)
[1] undici - request x 3,578 ops/sec ±2.83% (67 runs sampled)
[1] undici - pool - request - multiple sockets x 3,830 ops/sec ±3.83% (69 runs sampled)
[1] undici - stream x 3,773 ops/sec ±3.81% (65 runs sampled)
[1] undici - dispatch x 3,976 ops/sec ±3.07% (71 runs sampled)
[1] undici - noop x 4,302 ops/sec ±2.46% (58 runs sampled)

I'm not sure why my results are lower than average on both branches, but I thought it would have been worse.


mcollina commented Mar 2, 2021

This is incredibly positive. Good work @dnlup!


dnlup commented Mar 4, 2021

The only test remaining is https://github.com/nodejs/undici/blob/master/test/client-pipeline.js#L975.

The buffers differ by 2 bytes; from the tap output, it looks like the first 2. I manually checked the chunks in the on('data') callback and they all have the same size. kOnBody is different now because we have direct access to the body buffer, and we don't have the offsets into the whole message buffer (at least I couldn't find them; it looks like that's implemented in the Node parser, not in llhttp). I'll see if I can solve this.


ronag commented Mar 4, 2021

Could maybe be a string encoding thing?


dnlup commented Mar 4, 2021

Could maybe be a string encoding thing?

It makes sense. How could I test it?


dnlup commented Mar 7, 2021

I think if there has been an error then the state is pretty much undefined and the parser should not be used anymore. I don't think we do, but it might be worth it to always fail if execute is called after a previous failure. Maybe unnecessary?

I'll dig into it. There's a reset function in llhttp that we are not using. I'll check what the Node parser is doing.


ronag commented Mar 7, 2021

I'll dig into it. There's a reset function in llhttp that we are not using. I'll check what the Node parser is doing.

I don't think we need to fix this since we always destroy the socket and parser on error anyway. I just think we should have an assertion or error so we don't make mistakes in the future.

const { execSync } = require('child_process')
const TMP = require('os').tmpdir()

const REPO = '[email protected]:dnlup/llhttp.git'
Member:

I would prefer if we used the upstream repository instead and we copy over whatever files we need.

Alternatively, I'd be ok with an approach that adds this to llhttp itself.

Contributor Author:

I would prefer if we used the upstream repository instead and we copy over whatever files we need.

Expanding on https://github.com/nodejs/undici/pull/575/files#r589005221 (and thinking out loud): we need to build the C files prior to building the wasm binary (and the source files are the same ones llhttp uses to build the parser), so I think moving that out of llhttp would require us to duplicate a lot of build logic here. For example, we would need the C source files and the JS files to build the parser, and we'd have to add the build scripts.

Alternatively, I'd be ok with an approach that adds this to llhttp itself.

I'll open a PR to llhttp and see what happens.

Contributor Author:

Ok, here's the PR:

nodejs/llhttp#93

This implements a minimal parser that uses llhttp built with wasm.
A custom repository was set up to provide a custom wasm build.

Benchmarks of the current changes:

http - no agent  x 436 ops/sec ±4.06% (54 runs sampled)
http - keepalive x 453 ops/sec ±2.72% (54 runs sampled)
http - keepalive - multiple sockets x 3,191 ops/sec ±2.30% (73 runs sampled)
undici - pipeline x 4,062 ops/sec ±3.14% (73 runs sampled)
undici - request x 3,925 ops/sec ±2.81% (74 runs sampled)
undici - pool - request - multiple sockets x 4,045 ops/sec ±3.36% (76 runs sampled)
undici - stream x 3,912 ops/sec ±2.69% (74 runs sampled)
undici - dispatch x 4,371 ops/sec ±1.96% (79 runs sampled)
undici - noop x 4,237 ops/sec ±2.58% (59 runs sampled)

Benchmarks of the stable branch:

http - no agent  x 410 ops/sec ±4.18% (63 runs sampled)
http - keepalive x 446 ops/sec ±2.75% (53 runs sampled)
http - keepalive - multiple sockets x 3,248 ops/sec ±2.97% (73 runs sampled)
undici - pipeline x 3,666 ops/sec ±3.18% (73 runs sampled)
undici - request x 3,951 ops/sec ±3.04% (72 runs sampled)
undici - pool - request - multiple sockets x 4,017 ops/sec ±3.16% (75 runs sampled)
undici - stream x 4,014 ops/sec ±3.89% (70 runs sampled)
undici - dispatch x 3,823 ops/sec ±3.46% (67 runs sampled)
undici - noop x 4,569 ops/sec ±1.99% (59 runs sampled)

dnlup commented Mar 8, 2021

I think if there has been an error then the state is pretty much undefined and the parser should not be used anymore. I don't think we do, but it might be worth it to always fail if execute is called after a previous failure. Maybe unnecessary?

For what it's worth, I think it's already handled inside llhttp, which keeps returning the last error on subsequent calls if execute returned a non-pause error code:

https://github.com/nodejs/llhttp/blob/master/src/native/api.h#L81

That should already cause a request to fail because execute would return an error.


ronag commented Mar 11, 2021

Was reading nodejs/node#37676 (comment) and got curious whether our wasm build is strict or loose?


dnlup commented Mar 11, 2021

Was reading nodejs/node#37676 (comment) and got curious whether our wasm build is strict or loose?

I suspect it is strict. I don't see much difference between the Node build and the wasm build. How does Node disable strict mode? Maybe by removing this definition?

Base automatically changed from master to main March 12, 2021 21:33

ronag commented Mar 23, 2021

@dnlup @mcollina should we try to pull in all the required llhttp changes into undici (i.e. the wasm parts)? Doing things through llhttp seems slow and I feel it will stall progress for future work as well.


dnlup commented Mar 23, 2021

@dnlup @mcollina should we try to pull in all the required llhttp changes into undici (i.e. the wasm parts)? Doing things through llhttp seems slow and I feel it will stall progress for future work as well.

We would need to commit the C source and the constants.js file, and add some build scripts. These are the first things that come to mind.


ronag commented Mar 23, 2021

I wouldn't mind that.

@mcollina

I'm ok with vendoring the full llhttp source and applying patches on top, keeping the automation in our own tree.

@dnlup dnlup mentioned this pull request Mar 23, 2021
@ronag ronag added this to the 4.0 milestone Mar 24, 2021

ronag commented Mar 25, 2021

@mcollina can we merge this once conflicts are resolved?

@mcollina left a comment

lgtm

@mcollina

This can land if CI passes.


dnlup commented Mar 25, 2021

I think CI is fine, it's stuck for some reason. I am waiting on #601 to remove the deps folder from the bundled assets.


dnlup commented Mar 25, 2021

Apologies, the comment refers to #611. I think we should merge that PR.


ronag commented Mar 25, 2021

Apologies, the comment refers to #611. I think we should merge that PR.

I'm confused. Merge into this PR? Or what?


dnlup commented Mar 25, 2021

Apologies, the comment refers to #611. I think we should merge that PR.

I'm confused. Merge into this PR? Or what?

Merge that PR into main directly. It has the same code; it just embeds the native sources and the build script.


ronag commented Mar 25, 2021

Merge that PR into main directly. It has the same code; it just embeds the native sources and the build script.

So close this PR?

@mcollina mcollina closed this Mar 25, 2021

dnlup commented Mar 25, 2021

Merge that PR into main directly. It has the same code; it just embeds the native sources and the build script.

So close this PR?

Yes, you're right. I guess I was waiting to see if this was still doable.

Successfully merging this pull request may close these issues.

Use llhttp through wasm
4 participants