
Reference implementation of defer and stream spec #3659

Merged (13 commits, Aug 30, 2022)

Conversation

@robrichard (Contributor) commented Jun 23, 2022

Continued from #2839

This is the reference implementation of the defer/stream spec proposal. The corresponding PR for spec changes is here: graphql/graphql-spec#742

Please limit comments on this PR to code review of this implementation. Discussion on the defer/stream spec is located in a dedicated repo: https://github.com/robrichard/defer-stream-wg/discussions

@netlify bot commented Jun 23, 2022

Deploy Preview for compassionate-pike-271cb3 ready!

🔨 Latest commit: ee0ea6c
🔍 Latest deploy log: https://app.netlify.com/sites/compassionate-pike-271cb3/deploys/63081fa2f330ca0008a48404
😎 Deploy Preview: https://deploy-preview-3659--compassionate-pike-271cb3.netlify.app

@github-actions bot commented

Hi @robrichard, I'm the @github-actions bot, happy to help you with this PR 👋

Supported commands

Please post these commands in separate comments, one command per comment:

  • @github-actions run-benchmark - Run benchmark comparing base and merge commits for this PR
  • @github-actions publish-pr-on-npm - Build package from this PR and publish it on NPM

@robrichard

This comment has been minimized.

@github-actions bot commented

@github-actions publish-pr-on-npm

@robrichard The latest changes of this PR are available on NPM as
graphql@17.0.0-alpha.1.canary.pr.3659.cef660554446d49cec9a0958afb9690dd0b19193
Note: no guarantees provided, so please use your own discretion.

You can also depend on the latest version built from this PR:
npm install --save graphql@canary-pr-3659

@glasser

This comment has been minimized.

@github-actions bot commented

@github-actions publish-pr-on-npm

@glasser The latest changes of this PR are available on NPM as
graphql@17.0.0-alpha.1.canary.pr.3659.735abf5edacd99b712ddb40d89bd8b213640eb07
Note: no guarantees provided, so please use your own discretion.

You can also depend on the latest version built from this PR:
npm install --save graphql@canary-pr-3659

@glasser (Contributor) commented Aug 16, 2022

@robrichard @IvanGoncharov I'm finding the types a bit hard to follow here.

So execute can either return (perhaps via Promise) a single ExecutionResult or an async generator of AsyncExecutionResults. That might make me think that the former case happens when incremental delivery is not involved, and the latter happens with incremental delivery.

However, ExecutionResult itself has hasNext and incremental so I guess that's not true. But maybe if you get an ExecutionResult directly from execute it won't have those fields set?

AsyncExecutionResult can be ExecutionResult or SubsequentExecutionResult. The difference is that only ExecutionResult has errors and data; SubsequentExecutionResult has those nested inside incremental.

But I don't get why ExecutionResult can have errors and data both at the top level and nested inside incremental.

I also don't get why ExecutionResult and SubsequentExecutionResult have extensions both at the top level and nested inside incremental.

This would make a lot more sense to me if there was the "non incremental delivery" return value that only has data/errors/extensions at the top level, and the "incremental delivery" return value that only has data/errors/extensions nested inside "incremental". Am I missing something?
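A minimal sketch of the split being described here, using hypothetical type names rather than the types actually in this PR (TypeScript, for illustration only):

```ts
import type { GraphQLError } from 'graphql';

// Non-incremental delivery: data/errors/extensions live only at the top level.
interface SingleResultSketch {
  data?: Record<string, unknown> | null;
  errors?: ReadonlyArray<GraphQLError>;
  extensions?: Record<string, unknown>;
}

// Incremental delivery: the per-chunk payloads live only under `incremental`.
interface IncrementalResultSketch {
  incremental: ReadonlyArray<{
    data?: Record<string, unknown> | null;
    items?: ReadonlyArray<unknown>;
    errors?: ReadonlyArray<GraphQLError>;
    extensions?: Record<string, unknown>;
    path: ReadonlyArray<string | number>;
    label?: string;
  }>;
  hasNext: boolean;
}
```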

I'll take a look at the source too to see if that helps me understand it better.

EDITED FOLLOWUP

OK, it looks like the reason that all the fields of AsyncExecutionResult are also on ExecutionResult is related to the use of flattenAsyncIterable in mapSourceToResponse, a subscription-specific function that I don't really understand. This seems like it ought to be avoidable, hopefully?

@glasser (Contributor) commented Aug 16, 2022

I tried to simplify the types as described above. #3703 is what I came up with.

A lot of the complexity comes from the incremental field; this is a pretty recent addition, right? I'm not sure I really understand why it exists rather than its fields going directly on multiple top-level messages. Something about being able to deliver multiple deferred chunks at once? Does incremental actually get written in the JSON on the wire?
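For context, a hedged sketch of what a subsequent payload carrying an incremental batch might look like on the wire under the proposal being discussed, with one deferred fragment and one stream chunk delivered together (field names follow the spec PR; the exact shape may still change):

```ts
// Hypothetical subsequent payload: a deferred fragment result (data + path)
// and a stream chunk (items + path) batched in a single response.
// `hasNext: true` signals that more payloads are still coming.
const subsequentPayload = {
  incremental: [
    { label: 'profileDefer', path: ['user'], data: { bio: '...' } },
    { label: 'friendsStream', path: ['user', 'friends', 2], items: [{ name: 'Carol' }] },
  ],
  hasNext: true,
};
```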

@glasser (Contributor) commented Aug 18, 2022

Having tried to implement consuming in Apollo Server, I'm increasingly convinced that (assuming we don't want the first message to have an incremental field, which seems to be the consensus) we should make execute (or its replacement) always return a "first execution result" (which should never have an incremental field), and then if anything is deferred/streamed, also return an async iterator of "subsequent execution results" (which should always have an incremental field). That works a lot better from the consumer side than the current async iterator which is "never empty and the first one has a different type from the rest". I would also much rather have a nice strongly typed optional field with the subsequent result iterator than have to probe a result to figure out what kind it is (basically requiring me to copy and paste isAsyncIterator into my codebase).
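A rough sketch of the return shape being proposed, with illustrative names only (not this PR's actual API):

```ts
import type { ExecutionResult, GraphQLError } from 'graphql';

// Illustrative only: the initial result never carries `incremental`.
interface InitialIncrementalResult {
  data?: Record<string, unknown> | null;
  errors?: ReadonlyArray<GraphQLError>;
  hasNext: true;
}

// Illustrative only: every subsequent result carries `incremental`.
interface SubsequentIncrementalResult {
  incremental: ReadonlyArray<unknown>;
  hasNext: boolean;
}

// Nothing deferred/streamed: a plain ExecutionResult.
// Otherwise: the first result plus a strongly typed iterator of the rest.
type ProposedExecuteResult =
  | ExecutionResult
  | {
      initialResult: InitialIncrementalResult;
      subsequentResults: AsyncGenerator<SubsequentIncrementalResult, void, void>;
    };
```

A consumer could then branch on the presence of subsequentResults instead of probing the result with an async-iterable check.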

@yaacovCR (Contributor) commented Aug 23, 2022

We've hit a few concurrency bugs already (#2975, #3710) with the "raw" async generator objects currently used within incremental delivery.

In #3694, we have a PR to introduce @brainkim 's repeaters into graphql-js to ensure that our returned async generator objects actually behave exactly like async generators by design.

In particular:

  1. Calls to next(), return(), and throw() will not interfere with each other; they will be executed sequentially.
  2. The promises returned by next(), return(), and throw() will all settle in call order.

graphql-executor builds on Repeaters to create a publisher/dispatcher abstraction that allows the executor to push results to the generator as they are ready, without the need for any complicated promise racing.

#3694 just uses Repeaters to implement mapAsyncIterable, but graphql-executor does the same for flattenAsyncIterable (as well as using them for the publisher/dispatcher).
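For illustration, a minimal sketch (not the actual #3694 code) of a Repeater-backed mapAsyncIterable, assuming the @repeaterjs/repeater package; the Repeater guarantees that next()/return()/throw() calls on the result settle in call order while the executor pulls from the source sequentially:

```ts
import { Repeater } from '@repeaterjs/repeater';

// Sketch only: map each value of a source AsyncIterable through fn, backed by
// a Repeater so that consumer calls cannot interfere with each other.
function mapAsyncIterableSketch<T, U>(
  source: AsyncIterable<T>,
  fn: (value: T) => PromiseLike<U> | U,
): AsyncIterableIterator<U> {
  return new Repeater<U>(async (push, stop) => {
    const iterator = source[Symbol.asyncIterator]();
    let finished = false;
    // stop settles when the consumer calls return() (or stop() is called below).
    stop.then(() => {
      finished = true;
    });
    try {
      while (!finished) {
        // Note: if this next() never settles, cleanup never runs -- the same
        // leak discussed below; the source has to guarantee settlement.
        const { value, done } = await iterator.next();
        if (done) {
          break;
        }
        await push(await fn(value));
      }
    } finally {
      await iterator.return?.();
      stop();
    }
  });
}
```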

@robrichard

This comment has been minimized.

@github-actions bot commented

@github-actions publish-pr-on-npm

@robrichard The latest changes of this PR are available on NPM as
graphql@17.0.0-alpha.1.canary.pr.3659.5dba20aef36112d13569d5f296ef967383e60d0f
Note: no guarantees provided, so please use your own discretion.

You can also depend on the latest version built from this PR:
npm install --save graphql@canary-pr-3659

@yaacovCR (Contributor) commented Aug 25, 2022

Hmmm, looks like I needed a longer think on all this.

@glasser -- some helpful input from you would be welcome here, based on your comment within the flattenAsyncIterable iterator as to why it's important to support concurrent return calls.

Some links I've seen around about this:

  1. @IvanGoncharov pointed me to the iterator helpers proposal (which in my read doesn't have them).

  2. This TC39 issue, and in particular this comment from @freiksenet and these two comments from @brainkim there.

Note that @freiksenet points out that allowing concurrent returns will let the inner iterator kick off its return/cleanup, but as @freiksenet and @brainkim also point out, if there is a prior .next() that never settles, memory will still leak. The solution is to make sure that all long-running promises eventually settle, i.e. that no iterator is truly infinite, which can only really be done at the source; once that is done, the memory leak is closed and the cleanup code can be called.

So it seems to me that concurrent returns aren't quite a must, but I would really like to get others' opinions on this!

@glasser (Contributor) commented Aug 25, 2022

One option is to just avoid async iterators entirely. @benjamn privately suggested to me the idea that each chunk just contains a next(): Promise<Chunk>; there's no complexity about parallel calls here because each next() call on the same chunk would always return the same next chunk. You'd need some other explicit close() or cancel() or whatever call. The main advantage of async iterators to me is just that it's how graphql-js already works for subscriptions.
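A hedged sketch of that chunk-based alternative, with entirely hypothetical names (nothing like this exists in graphql-js today):

```ts
// Each chunk links to the next one; calling next() more than once on the same
// chunk is intended to resolve to the same next chunk, so there is no
// parallel-call hazard to reason about.
interface ResultChunk {
  payload: unknown; // the formatted payload for this chunk
  hasNext: boolean;
  next(): Promise<ResultChunk | null>; // null once the stream is exhausted
  cancel(): Promise<void>; // explicit cleanup instead of iterator.return()
}

// Consuming is a simple loop over the linked chunks.
async function drain(first: ResultChunk, sink: (payload: unknown) => void) {
  let chunk: ResultChunk | null = first;
  while (chunk !== null) {
    sink(chunk.payload);
    chunk = chunk.hasNext ? await chunk.next() : null;
  }
}
```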

Fortunately this is purely a question about how to handle this in graphql-js, not something that should block the overall spec update from merging. I think sticking with the same API as subscriptions makes sense, and graphql-js could consider later creating a second new API (both could be supported) for following these streams if we want.

@glasser (Contributor) commented Aug 25, 2022

(BTW my basic opinion about concurrent next/returns is that if we claim to implement AsyncIterator then we should do so correctly, but if we eventually decide a simpler API is good enough for our use case then that's good too. For the specific case of defer/stream results, the ordering between chunks is quite important — the chunks are designed to be applied in order. So the ability to wait on multiple next calls in parallel (or to use return for anything other than cleanup) is not really important if you ask me.)

@michaelstaib (Member) commented

@robrichard is the new items name for the result already implemented? I went through the tests but did not find any result with an items field for stream results.

@robrichard (Contributor, Author)

@michaelstaib (Member) commented

Ah, I overlooked this one. Thanks for pointing it out.

@calvincestari commented Jul 21, 2023

@robrichard - I know this was a while ago, but is it correct that GraphQLDeferDirective and GraphQLStreamDirective were not added to the specifiedDirectives list? For schemas that want to use those directives, should they be added manually to the config/schema directive lists?

Update: https://github.com/graphql/graphql-js/blob/main/website/docs/tutorials/defer-stream.md
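For anyone finding this later, a minimal sketch of enabling @defer/@stream on a schema, assuming the GraphQLDeferDirective and GraphQLStreamDirective exports described in the linked tutorial; they are not part of specifiedDirectives, so they have to be added explicitly:

```ts
import {
  GraphQLObjectType,
  GraphQLSchema,
  GraphQLString,
  specifiedDirectives,
  GraphQLDeferDirective, // assumed export per the linked tutorial
  GraphQLStreamDirective, // assumed export per the linked tutorial
} from 'graphql';

const queryType = new GraphQLObjectType({
  name: 'Query',
  fields: {
    hello: { type: GraphQLString, resolve: () => 'world' },
  },
});

// @defer and @stream are not in specifiedDirectives, so a schema that wants
// to accept them has to list them alongside the spec-defined directives.
const schema = new GraphQLSchema({
  query: queryType,
  directives: [
    ...specifiedDirectives,
    GraphQLDeferDirective,
    GraphQLStreamDirective,
  ],
});
```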

Labels
PR: feature 🚀 requires increase of "minor" version number
7 participants