
chrome-extension scripts included in audit report script timings #4878

Closed
depinski opened this issue Mar 27, 2018 · 10 comments · Fixed by #5666

Comments

@depinski

This is confusing and not useful. Either split the report into sections "Scripts loaded by the page" and "Scripts loaded by your browser" with separate totals for each, or remove the chrome-extension scripts entirely. I'm analyzing a site and more than 1000 ms of the 2200 ms JS loading time is from chrome-extensions.

@patrickhulce
Collaborator

This is an accurate reflection of how the extension has impacted the performance of the page. If you're measuring page load with extensions enabled, then unfortunately the performance measured is inseparable from the extension even after the fact. Use the CLI/incognito/a clean browser profile for auditing if you don't wish to be impacted by your extensions.

As for filtering the report items, #4516 tracks adding a filter for resources you don't have control over. If that wouldn't fully cover your use case, could you add to that issue what you'd like to see?
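For example, here's a minimal sketch of auditing against a fresh, isolated Chrome profile by driving Lighthouse programmatically. It assumes the `lighthouse` and `chrome-launcher` npm packages, and the exact option shapes may vary between Lighthouse versions:

```js
// Sketch only: run Lighthouse against a throwaway Chrome profile so that
// user-installed extensions cannot affect the measurement.
// Assumes `npm install lighthouse chrome-launcher`.
import lighthouse from 'lighthouse';
import * as chromeLauncher from 'chrome-launcher';

// chrome-launcher creates a fresh temporary profile by default,
// so no extensions are installed in the launched browser.
const chrome = await chromeLauncher.launch({chromeFlags: ['--headless']});

const result = await lighthouse('https://example.com', {
  port: chrome.port,               // attach to the clean instance
  onlyCategories: ['performance'], // just the perf audits
});

console.log('Performance score:', result.lhr.categories.performance.score);
await chrome.kill();
```

Because the launched browser uses a temporary profile, nothing extension-related can show up in the script timings.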

@docluv

docluv commented May 10, 2018

This is bad practice. I get using incognito or the CLI, but the average person won't know to do that and will get this horrible result that honestly isn't accurate. I don't even know why extensions are part of the same process; they interfere with the service worker fetch handler as well.

As for script load impact, extensions account for > 95% of the delays, so I get a bad performance score.

@Nadeeshyama

It makes sense to test your own code without content scripts getting in the way. While it's true that content scripts can affect page performance, there is no way to control this as a developer, since end users may install any extension in their browser. Perhaps make it an option to disable or hide content scripts while the test is running.

@centminmod

It's happening in Chrome 67 even in incognito mode for me: chrome-extension scripts are included in the tests, e.g. the critical request chain.

@patrickhulce
Collaborator

@centminmod you must have some extensions enabled in incognito mode, then. Each extension has a toggle on the chrome://extensions settings page if you want to allow it in incognito too.

FWIW, this is also why the default setup for LH CLI is to use a fresh Chrome profile that's isolated.

@NickWoodward

I don't get it. Chrome knows it's an extension and yet Lighthouse still reports it. What end user is running Lighthouse? And I don't see how it's useful to devs when they have no idea what extensions are being used.

@paulirish
Member

@NickWoodward if you have some extremely slow Chrome extensions (like ones that run a bunch of JS on every single DOM mutation, which is not uncommon for ones like Grammarly), the extensions themselves can significantly worsen the performance of the page. Are you saying you would prefer the report exclude the attribution so you just assume it's the web page's fault?

@NickWoodward

NickWoodward commented Jun 1, 2024

> @NickWoodward if you have some extremely slow Chrome extensions (like ones that run a bunch of JS on every single DOM mutation, which is not uncommon for ones like Grammarly), the extensions themselves can significantly worsen the performance of the page. Are you saying you would prefer the report exclude the attribution so you just assume it's the web page's fault?

100%. I want to be able to assume it's the webpage's fault, because that's what I control as a developer. Extensions vary across users' browsers, so arbitrarily testing the ones I have installed doesn't make much sense to me (no doubt out of ignorance). That's why I seemed confused, unless Lighthouse is also a tool for testing browser extensions? (And even then that doesn't seem like it should be the default.) Outside of that I really don't get the use case. Genuine question, and I appreciate the quick reply, thanks!

@connorjclark
Collaborator

connorjclark commented Jun 1, 2024

The core issue here is that we can't control when extensions are enabled; only the user can. There would be a non-trivial amount of Chromium work involved in disabling extensions during performance tests. I have kickstarted a discussion internally on the possibility of disabling extensions on the user's behalf. Until then, the best course for Lighthouse is to detect when extensions are present and recommend disabling them in order to remove their impact from the report.
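In the meantime, a rough way to gauge extension impact after the fact is to post-process the JSON report yourself. A minimal sketch follows; the 'bootup-time' audit id is real, but treat the exact item shape as an assumption that may vary across Lighthouse versions:

```js
// Sketch only: given a parsed Lighthouse JSON report (`report` is the LHR
// object), split the per-script timings from the "JavaScript execution time"
// audit ('bootup-time') into page-owned scripts and extension-injected
// scripts, keyed on the URL scheme.
function splitScriptTimings(report) {
  const items = report.audits['bootup-time'].details.items;
  const fromExtensions = items.filter((item) => item.url.startsWith('chrome-extension://'));
  const fromPage = items.filter((item) => !item.url.startsWith('chrome-extension://'));
  return {fromPage, fromExtensions};
}
```

Anything under `fromExtensions` is overhead from the auditing browser's profile rather than from the page itself.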

@NickWoodward

NickWoodward commented Jun 1, 2024

> The core issue here is that we can't control when extensions are enabled; only the user can. There would be a non-trivial amount of Chromium work involved in disabling extensions during performance tests. I have kickstarted a discussion internally on the possibility of disabling extensions on the user's behalf. Until then, the best course for Lighthouse is to detect when extensions are present and recommend disabling them in order to remove their impact from the report.

Ah, OK, that makes sense. So I was confusing being able to detect them as extensions with being able to do something about them. Thanks for the reply!
