misc: update stack packs, remove duplicated stack pack files #11396
Conversation
@connorjclark should we do the revert of #11370 and reland with a test once the upstream fix happens, before master drifts too far and we build on top of it? Or am I misunderstanding the impact of the bug over there? (It's my understanding that currently no stack packs will render at all.)
@patrickhulce, I'm about to update the repo. I had to wait to get admin access. It's a quick fix.
Could we get a test case, using inline snapshots, that checks the stack packs match audit ids? :D
No, because there are plugins inside these stack packs. I probably misunderstood, though: how would inline snapshots do this? Can you say more?
That's why I said "using inline snapshots" :)

Update: Ah yeah sure, here's an example of what I'm saying:

const nonMatchingStackPackKeys = // magic
expect(nonMatchingStackPackKeys).toMatchInlineSnapshot(`
  // this list better be small and not change or we know something's wrong :)
`);
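For illustration only, here is a minimal sketch of how that "magic" might be filled in, assuming each pack exposes a `UIStrings` object keyed by audit id and that audit ids can be derived from the basenames of the audit paths in Lighthouse's default config. The require paths and that derivation are assumptions, not the code that landed in this PR.

```js
// Hypothetical sketch of the "magic" above: collect stack-pack keys that
// don't correspond to a known audit id.
const lighthouseStackPacks = require('lighthouse-stack-packs');
const defaultConfig = require('../../config/default-config.js'); // path is an assumption

it('stack pack keys match audit ids', () => {
  // Assumption: audit ids match the basename of each audit path in the default config.
  const auditIds = new Set(defaultConfig.audits.map(audit => {
    const path = typeof audit === 'string' ? audit : audit.path;
    return path.split('/').pop();
  }));

  const nonMatchingStackPackKeys = new Set();
  for (const pack of lighthouseStackPacks) {
    for (const key of Object.keys(pack.UIStrings)) {
      if (!auditIds.has(key)) nonMatchingStackPackKeys.add(key);
    }
  }

  // This list better be small and not change, or we know something's wrong :)
  expect([...nonMatchingStackPackKeys]).toMatchInlineSnapshot();
});
```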
That feels a bit too much like a duplicate of sample JSON we won't notice 😆 I was thinking something more like the pseudocode below.
ah ok cool. let's do both 🎉
expect([...unrecognizedKeys]).toMatchInlineSnapshot(`
  Array [
    "time-to-first-byte",
huzzah catching bugs already! I'm guessing it should be `server-response-time` since it was renamed from `time-to-first-byte` :)
cool, but punt: GoogleChrome/lighthouse-stack-packs#50
edit: or maybe don't punt. idk
sure punt is fine, wasn't trying to flag this as a blocker, just celebrating the success of its implementation 🎉 :)
describe('lighthouse-stack-packs', () => {
  it('snapshot all strings', () => {
    /* eslint-disable max-len */
    expect(lighthouseStackPacks).toMatchInlineSnapshot(`
Snapshot testing the entirety of the stack packs seems a little excessive? We don't have a snapshot test for all of axe.min.js, for instance :)
Is there an underlying purpose that can be served with something more succinct?
The test that @patrickhulce suggested does a narrower check for a problem we've seen already.
This one is larger: a catch-all for whatever strings end up in Lighthouse, and a chance for us to stop something bad from being published.
Maybe what we're looking for here then, @connorjclark, is a comment expressing what a reviewer should look for? Right now I'm not sure what changes to this snapshot I should expect and what changes would be bad (which was the same core problem in missing the `-` -> `_` change in the sample_v2.json).
I think what @brendankenny was saying is that if we're trying to prevent something like "all of these strings disappeared", then we can test that with a narrower snapshot assertion rather than having to inspect all of these strings manually.
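As one possible reading of "a narrower snapshot assertion" (purely illustrative; the pack shape with `id` and `UIStrings` is an assumption): snapshot only the structure, not every string, so reviewers see which audits each pack covers without diffing prose.

```js
// Hypothetical narrower assertion: snapshot the pack ids and the audit keys
// they cover, so a reviewer sees structural changes without diffing prose.
const lighthouseStackPacks = require('lighthouse-stack-packs');

it('stack pack structure stays stable', () => {
  const summary = lighthouseStackPacks.map(pack => ({
    id: pack.id,
    keys: Object.keys(pack.UIStrings).sort(),
  }));
  expect(summary).toMatchSnapshot();
});
```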
> [rather ...] than having to inspect all of these strings manually.
But we should check these manually. The release process for this package, while still handled by Googlers, is not 100% in our direct control. It's maybe not a realistic attack vector, but a quick 5-10s glance every quarter when we update this package is not an expensive thing.
> But we should check these manually. The release process for this package, while still handled by Googlers, is not 100% in our direct control. It's maybe not a realistic attack vector, but a quick 5-10s glance every quarter when we update this package is not an expensive thing.
That's not really a test (unit or otherwise), though; it's a "hey human, check all of this for...something" (which we're also doing for the same strings in en-US.json, and x50 for all the other languages when the translations come in).
If this is the chief need for this check, we should be explicit about it and do a periodic static copy of lighthouse-stack-packs into third-party/. Then it's crystal clear that what really needs to happen is a code review.
That seems like overkill, though. Really we care about things like: are joomla strings returned when getStackPacks is called with joomla detected, are strings displayed by the renderer when they're in sample_v2 like they are, etc. Pushing the basic responsibilities to the separate repo seems like part of the point of having it in a separate repo.
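A rough sketch of the kind of behavioral test being described, assuming `getStackPacks` is exported from lighthouse-core/lib/stack-packs.js and accepts the detected-stacks artifact (the require path and the artifact shape are assumptions):

```js
// Hypothetical behavioral test: joomla strings come back when joomla is detected.
const {getStackPacks} = require('../../lib/stack-packs.js'); // path is an assumption

it('returns the joomla pack when joomla is detected', () => {
  // Assumption: the detected-stacks artifact is an array of {detector, id, name, version}.
  const pageStacks = [{detector: 'js', id: 'joomla', name: 'Joomla', version: undefined}];
  const packs = getStackPacks(pageStacks);
  expect(packs.map(pack => pack.id)).toContain('joomla');
});
```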
> is not 100% in our direct control. It's maybe not a realistic attack vector

and if we should be worried about this, the renderer is ok because no string gets privileged treatment, but we should really bulk up collect-strings and do some unknown checking, because each string in there costs us $$ for translations :)
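Purely as an illustration of what "unknown checking" could mean (nothing here reflects the actual collect-strings script; the helper and its inputs are hypothetical): fail the collection step when a stack pack ships a string keyed to an audit id we don't recognize, so unexpected strings never reach translators.

```js
// Hypothetical guard for a string-collection step: reject stack-pack strings
// whose keys don't map to a known audit id, since every collected string
// costs translation money.
function assertNoUnknownStackPackStrings(stackPacks, knownAuditIds) {
  const unknown = [];
  for (const pack of stackPacks) {
    for (const key of Object.keys(pack.UIStrings)) {
      if (!knownAuditIds.has(key)) unknown.push(`${pack.id}: ${key}`);
    }
  }
  if (unknown.length > 0) {
    throw new Error(`Unknown stack pack strings found: ${unknown.join(', ')}`);
  }
}

module.exports = {assertNoUnknownStackPackStrings};
```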
updated the tests
Overall LGTM
    `);
  });
// Keys should only be added, not removed.
Are these instructions for someone updating the snapshot, or a description of the test?
for updating the snapshot. moved it down a bit.
Fixes the regression caused by underscores (see #11370 (comment)).
Also forgot to delete some old files in #10138.
Had to do another i18n import.