Log how long each extension module takes to import #1171
Conversation
Based on some detailed debugging in 2i2c-org/infrastructure#2047 to figure out why a Jupyter Server *process* sometimes takes more than 30s to start, the primary culprit was server extensions that took multiple seconds just to import. Thanks to some ad-hoc patching (2i2c-org/infrastructure#2047 (comment)), I was able to figure out which extensions were slow. This PR emits extension import time as log messages, so this information is *much* more visible.

I also explicitly chose info instead of debug, primarily because I believe it is *very* important to surface this performance information to users, so they can go bug the appropriate extension. Otherwise, it just feels like 'jupyter server is slow!'. This is compounded by the fact that while notebook server doesn't import *disabled* extensions, jupyter_server does seem to, so it's hard to isolate this.
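For illustration, a minimal sketch of the idea (the `timed_import` helper and logger setup below are hypothetical stand-ins, not necessarily the exact code this PR adds): wrap the `importlib.import_module` call for each extension package in a timer and emit the elapsed time at info level.

```python
# Hedged sketch: time an extension module's import and log the duration.
# `timed_import` and the logger name are illustrative; the real change
# lives in jupyter_server's extension loading machinery.
import importlib
import logging
import time

log = logging.getLogger("ServerApp")


def timed_import(module_name: str):
    """Import an extension module and log how long the import took."""
    start = time.perf_counter()
    module = importlib.import_module(module_name)
    duration = time.perf_counter() - start
    # info rather than debug, so slow extensions show up in the default
    # server output without needing to turn on debug logging.
    log.info("Extension package %s took %.4fs to import", module_name, duration)
    return module
```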
Codecov Report
Base: 79.93% // Head: 79.56% // Decreases project coverage by -0.37%.
Additional details and impacted files:
@@            Coverage Diff             @@
##             main    #1171      +/-   ##
==========================================
- Coverage   79.93%   79.56%   -0.37%
==========================================
  Files          68       68
  Lines        8124     8130       +6
  Branches     1601     1602       +1
==========================================
- Hits         6494     6469      -25
- Misses       1205     1222      +17
- Partials      425      439      +14
☔ View full report at Codecov.
Lint failure seems totally unrelated?
Thanks @yuvipanda, yes it is unrelated.
Thanks!
Thanks Steve! (Sorry, I had reviewed, but got sidetracked prior to submitting my approval.)
Update: Damn, just realized I commented on the wrong PR! (Thanks Yuvi, and Steve & David anyway though 😄)
Thanks a lot for getting this out quickly, @davidbrochart and @blink1073 :)
Without this, the log messages printed in jupyter-server#1171 don't actually make it out. Whoops! LoggingConfigurable inherits from HasTraits and sets up logging appropriately, so we can have a log passed through.
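As a hedged illustration of why this matters (class and trait names below are made up, not jupyter_server's actual extension classes): a subclass of traitlets' LoggingConfigurable carries a `log` trait that is resolved from its parent Configurable when one is supplied, so `self.log.info(...)` is routed through the server's configured handlers, whereas a plain HasTraits subclass has no `log` at all.

```python
# Illustrative sketch only: ExtensionRecord is a made-up name.
# Subclassing LoggingConfigurable (rather than plain HasTraits) gives the
# object a `log` trait, taken from its parent Configurable when one is
# passed, so messages reach the server's logging handlers.
from traitlets import Unicode
from traitlets.config import LoggingConfigurable


class ExtensionRecord(LoggingConfigurable):
    """Stand-in for an object that logs during extension loading."""

    name = Unicode("")

    def announce(self, duration: float) -> None:
        # self.log comes from the parent (e.g. the ServerApp) when this
        # object is constructed with parent=..., so the message goes
        # through the normal server logging configuration.
        self.log.info("Extension %s took %.4fs to import", self.name, duration)
```

Constructed as `ExtensionRecord(name="myext", parent=server_app)`, the record logs through the parent's logger; without the LoggingConfigurable base there is no `log` attribute, which is why the messages never made it out.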