
fix(performance): pre-compute and lazily initialize endpoint methods #622

Merged: 3 commits into octokit:main on Jun 14, 2023

Conversation

ZauberNerd (Contributor)

We have observed that Octokit initialisation is quite slow,
especially in combination with [Probot](https://github.com/probot/probot),
which creates a new Octokit instance for each incoming request.
This blocks the main event loop for a considerable time.

With our change, the preparation of the endpoints API object is moved
into module scope, and a Proxy object defers the initialisation of
Octokit defaults and decorations until the first call of each method.

Although we have not measured it, we believe the overhead of the proxied
method call is insignificant compared to the network latency of an API call.
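
To make the idea concrete, here is a minimal sketch of the pattern (a simplified illustration, not the plugin's actual source; `ENDPOINTS` and `buildMethod` stand in for the plugin's real endpoint map and method builder):

```js
// Prepared once, in module scope, when the plugin is loaded.
const ENDPOINTS = {
  repos: {
    get: { method: "GET", url: "/repos/{owner}/{repo}" },
  },
};

// Stand-in for applying Octokit defaults and decorations to one endpoint.
function buildMethod(octokit, defaults) {
  return (params) => octokit.request({ ...defaults, ...params });
}

// Called per Octokit instance: only cheap Proxy objects are created here.
function endpointsToMethods(octokit) {
  const api = {};
  for (const scope of Object.keys(ENDPOINTS)) {
    const cache = {};
    api[scope] = new Proxy(cache, {
      get(cache, methodName) {
        // Unknown properties fall through untouched.
        if (!ENDPOINTS[scope][methodName]) return undefined;
        // Build and memoize the real method on first access only.
        if (!cache[methodName]) {
          cache[methodName] = buildMethod(octokit, ENDPOINTS[scope][methodName]);
        }
        return cache[methodName];
      },
    });
  }
  return api;
}
```

Only the first access to e.g. `api.repos.get` pays the construction cost; later accesses return the memoized function, so the per-request Octokit constructor no longer builds every endpoint method up front.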


We have profiled our change with a flamegraph generated with Clinic.js.
The two screenshots below show the before and after state of 10k octokit initialisations:

```js
// Benchmark: construct 10,000 Octokit instances with the plugin applied.
import { Octokit } from "@octokit/core";
import { restEndpointMethods } from "@octokit/plugin-rest-endpoint-methods";

for (let i = 0; i < 10000; i++) {
  const MyOctokit = Octokit.plugin(restEndpointMethods);
  const octokit = new MyOctokit({
    auth: `personal-access-token123`,
  });
}
```
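
For reference, flamegraphs like the ones below can be generated with Clinic.js by saving a benchmark such as the snippet above to a file (the name `bench.mjs` is only illustrative) and running it under `clinic flame -- node bench.mjs`.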

Before
[flamegraph screenshot: Screenshot from 2023-03-08 11-57-17]

After
[flamegraph screenshot: Screenshot from 2023-03-08 11-56-23]


Co-authored-by: Markus Wolf <[email protected]>
@wolfy1339 wolfy1339 added the Type: Feature New feature or request label Mar 8, 2023
@wolfy1339 wolfy1339 requested a review from gr2m March 8, 2023 15:30
gr2m (Contributor) previously approved these changes May 8, 2023

@gr2m left a comment


Fascinating. I haven't had a chance to use Proxies yet, but I think the usage makes a lot of sense here. I'm always reluctant about internal caching like this because the first call of a method behaves differently than the second, and if there is a rare error condition it's very hard to debug. But given that the methods in this package are very uniform, and the improvement is quite significant, I think it's worth it.

Could you please add a comment with a short description of how it works and why this optimization was done, with a reference to this pull request?

@gr2m (Contributor) commented Jun 13, 2023

> Could you please add a comment with a short description of how it works and why this optimization was done, with a reference to this pull request?

@ZauberNerd could you do that please? No worries if not, I'd say we can add that ourselves and merge the PR

@ZauberNerd (Contributor, Author)

@gr2m sorry for the delay. I've added a comment and fixed a typo (enpointMethodsMap => endpointMethodsMap).

@gr2m gr2m changed the title perf: pre-compute and lazily initialise octokit endpoint methods fix(performance): pre-compute and lazily initialize endpoint methods Jun 14, 2023
@gr2m gr2m merged commit a7887d0 into octokit:main Jun 14, 2023
@gr2m (Contributor) commented Jun 14, 2023

Thanks @ZauberNerd! The new release should be out any moment. It will be a fix release so if you reset your lock file you should get the update. Let us know if it all works as expected

@github-actions (Contributor)

🎉 This PR is included in version 7.2.3 🎉

The release is available on:

Your semantic-release bot 📦🚀

@ZauberNerd ZauberNerd deleted the improve-initialisation branch June 14, 2023 19:22
@ZauberNerd (Contributor, Author)

> Thanks @ZauberNerd! The new release should be out any moment. It will be a fix release so if you reset your lock file you should get the update. Let us know if it all works as expected

I'm using this in a Probot setup, and Probot is still on v5 of this dependency.
But I have been running this change via patch-package for a few months now and have seen no problems so far.

@github-actions (Contributor)

🎉 This PR is included in version 8.0.0-beta.3 🎉

The release is available on:

Your semantic-release bot 📦🚀

ZauberNerd added a commit to ZauberNerd/plugin-rest-endpoint-methods.js that referenced this pull request Oct 11, 2023
The implementation of octokit#622 only implemented a `get` Proxy trap.
This worked fine for the simple use case of invoking the methods, but
failed when users tried to mock functions with Jest or Sinon.
It also no longer listed all functions when pressing tab-tab in a REPL.

This commit implements further Proxy traps that are required for those
use cases and uses the cache object for mutating operations.

Fixes: octokit#683
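
As a rough illustration of what "further Proxy traps" means here (a hedged sketch under assumed names, not the committed fix), traps such as `has`, `ownKeys`, `getOwnPropertyDescriptor`, `defineProperty`, and `deleteProperty` make the proxied scope behave like a plain object for mocking libraries and for enumeration, with all writes applied to the cache object:

```js
// Hedged sketch of additional traps (not the committed fix): they make the
// proxied scope behave like a plain object for mocking (Jest/Sinon) and for
// enumeration (REPL tab-tab). `cache` holds lazily built methods and also
// receives all mutations; `methodNames` and `buildMethod` are illustrative.
function scopeProxy(cache, methodNames, buildMethod) {
  return new Proxy(cache, {
    get(cache, name) {
      // Lazily build and memoize a known method on first access.
      if (!(name in cache) && methodNames.includes(name)) {
        cache[name] = buildMethod(name);
      }
      return cache[name];
    },
    has(cache, name) {
      return name in cache || methodNames.includes(name);
    },
    ownKeys(cache) {
      // Lets Object.keys() and REPL tab completion list every method.
      return [...new Set([...Reflect.ownKeys(cache), ...methodNames])];
    },
    getOwnPropertyDescriptor(cache, name) {
      if (!(name in cache) && !methodNames.includes(name)) return undefined;
      return {
        value: this.get(cache, name), // reuse the lazy `get` logic
        writable: true,
        configurable: true,
        enumerable: true,
      };
    },
    defineProperty(cache, name, descriptor) {
      // Mutations (e.g. jest.spyOn replacing a method) land on the cache.
      Object.defineProperty(cache, name, descriptor);
      return true;
    },
    deleteProperty(cache, name) {
      delete cache[name];
      return true;
    },
  });
}
```

With traps like these, `jest.spyOn(api.repos, 'get')` or tab completion over `api.repos` works as it would on a plain object, at the cost of building methods eagerly when they are enumerated.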
wolfy1339 pushed a commit that referenced this pull request Oct 11, 2023

🎉 This issue has been resolved in version 7.2.3 🎉

The release is available on:

Your semantic-release bot 📦🚀
