
Add visual indicator to score to warn when it was run with non-default settings #8178

Open
av3nger opened this issue Apr 11, 2019 · 7 comments
Labels: docs, needs-design, P1.5, PSI/LR PageSpeed Insights and Lightrider, report

av3nger commented Apr 11, 2019

I am seeing inconsistent ratings for the same metrics when running scans from PageSpeed Insights vs a Lighthouse scan from Chrome. For example, a scan from PageSpeed Insights rates First Meaningful Paint at 1.2 seconds and marks it as a warning (see first screenshot), while a scan from Chrome marks the same metric as a pass even at 1.9 seconds (see second screenshot). Is that a bug?

[screenshot: PageSpeed Insights result]

[screenshot: Chrome Lighthouse result]

patrickhulce (Collaborator) commented

@av3nger are you looking at the desktop numbers in PSI by chance?

What two Lighthouse settings are you comparing? :)

av3nger (Author) commented Apr 11, 2019

@patrickhulce, both reports are for the desktop version. I'm trying to understand why PageSpeed Insights highlights the score in yellow when it should be green.

I found this spreadsheet. According to it, a 2.1 second Speed Index corresponds to a score between 99 and 100, which should make the coloring of the metric green. The same goes for the First Meaningful Paint metric.

PageSpeed Insights:
[screenshot]
Chrome Lighthouse:
[screenshot]

patrickhulce (Collaborator) commented

The default scoring (including the spreadsheet) assumes mobile conditions. PSI adjusts the scoring for desktop according to the differences in connection and processing power.

Because auditing a page in DevTools can involve any mixture of throttling conditions, and the environment is not as predictable as PSI's, the scoring rubric is not adjusted there, which is why you are seeing the discrepancy. Essentially, the assumptions that make scores comparable break down whenever you use any settings other than the default throttling.
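For context on why the rubric matters: Lighthouse scores each metric by placing its value on a log-normal curve anchored at a couple of control points, and "adjusting the scoring for desktop" amounts to shifting those control points. Here is a minimal sketch of that idea in Python; the control-point values and the `p10`-based parameterization are illustrative assumptions, not the exact constants Lighthouse ships.

```python
import math

# Hypothetical control points for one metric (milliseconds).
# A desktop curve would use smaller values than a mobile curve,
# which is why the same raw timing can score differently.
MOBILE_MEDIAN_MS = 2000   # value that scores 0.5 (illustrative)
MOBILE_P10_MS = 1000      # value that scores 0.9 (illustrative)

def lognormal_score(value_ms, median_ms, p10_ms):
    """Map a metric value to a 0..1 score on a log-normal curve.

    The curve is anchored so that `median_ms` scores 0.5 and
    `p10_ms` scores 0.9; the score is the fraction of the modeled
    distribution that is slower than `value_ms`.
    """
    if value_ms <= 0:
        return 1.0
    mu = math.log(median_ms)
    # Phi^-1(0.9): the standard-normal z-score at the 90th percentile.
    z_p10 = 1.2815515655446004
    sigma = (mu - math.log(p10_ms)) / z_p10
    z = (math.log(value_ms) - mu) / sigma
    # Complementary CDF of the log-normal distribution.
    return 0.5 * math.erfc(z / math.sqrt(2))

# The same 1500 ms value lands differently on a "mobile" curve
# vs a stricter hypothetical "desktop" curve (median 1000 ms).
mobile_score = lognormal_score(1500, MOBILE_MEDIAN_MS, MOBILE_P10_MS)
desktop_score = lognormal_score(1500, 1000, 500)
```

With the tighter desktop control points, 1500 ms falls below the median and scores poorly, while on the looser mobile curve the same value scores well, which is exactly the yellow-vs-green discrepancy described above.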

cc @paulirish @hwikyounglee I wonder if there should be some sort of visual indicator when Lighthouse was run with anything but the default settings that the score may be unexpected 🤔

av3nger (Author) commented Apr 11, 2019

@patrickhulce, thank you for the explanation. This should definitely be documented much better; it's far too confusing at the moment. The spreadsheet explains the scores, so you think, "OK, I just need 3,785 ms or less and the Speed Index metric will be scored as fast." Then you run a PSI scan, which uses Lighthouse, get a Speed Index of 2.1 seconds, and you're labeled "average"... 🤦‍♂️

@brendankenny brendankenny changed the title Metrics score ratings Add visual indicator to score to warn when it was run with non-default settings Jun 26, 2019
@paulirish paulirish added PSI/LR PageSpeed Insights and Lightrider and removed pagespeed-insights labels Jul 23, 2019
connorjclark (Collaborator) commented

At one point we considered showing the final screenshot in the report and wrapping it with the form factor (like a mobile device frame) as a way to indicate that the report is distinct from others. Might that apply here?

paulirish (Member) commented

Mock from #9379:

[image]

connorjclark (Collaborator) commented

Also considering showing the scoring curve in the metrics table somehow. Not this exactly, but the information we want to show is the same (the FCP mobile curve):

[image: FCP mobile scoring curve]

5 participants