
core(optimized-images): cap execution to 5 seconds #7237

Merged
4 commits merged into master on Mar 7, 2019

Conversation

patrickhulce
Collaborator

Summary
We cap execution of our OptimizedImage gatherer at 5s and estimate the savings on all the images we didn't get to.

I know it got big, but it's really just the capping + all the consequences of capping + tests.
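As a rough illustration of the fallback described in the summary, the audit side might look something like this (a hypothetical sketch; the ratio, field names, and function are assumptions for illustration, not Lighthouse's actual estimation logic):

```javascript
// Assumed savings ratio when we couldn't reencode the image in time (or it
// was too large to attempt). This constant is illustrative only.
const ESTIMATED_JPEG_RATIO = 0.8;

function getJpegSize(image) {
  // If the gatherer managed to reencode this image, use the measured size...
  if (image.jpegSize !== undefined) return image.jpegSize;
  // ...otherwise fall back to an estimate based on the original byte size.
  return Math.round(image.originalSize * ESTIMATED_JPEG_RATIO);
}
```

The audit then computes potential savings the same way regardless of whether the size was measured or estimated.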

Related Issues/PRs
closes #7173

Member

@paulirish left a comment


If I'm reading this right...

  1. We don't try to reencode any images >2MB; instead we rely on the estimate.
  2. For any images <=2MB, we sort them by size descending, so starting with 1.9MB and going down, we do the reencode with JPEG/WebP.
  3. We keep doing this until we've finished or we hit 5s total.
  4. At that point we stop.
  5. Over in the audits, if we don't have a jpegSize (because the image was >2MB or we ran out of time), we use the estimate.

Yes?
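The flow enumerated above could be sketched roughly like this (hypothetical names and structure for illustration; not the actual gatherer code):

```javascript
const MAX_IMAGE_SIZE = 2 * 1024 * 1024; // don't attempt images over 2MB
const MAX_TOTAL_TIME = 5000;            // cap total reencoding work at 5 seconds

// `reencode` stands in for the protocol call that reencodes an image and
// returns the resulting JPEG/WebP byte sizes.
async function computeOptimizedImages(images, reencode) {
  const results = [];
  // Largest first, so the biggest potential savings get measured before
  // the time budget runs out.
  const candidates = images
    .filter(img => img.size <= MAX_IMAGE_SIZE)
    .sort((a, b) => b.size - a.size);

  const startTime = Date.now();
  for (const image of candidates) {
    if (Date.now() - startTime > MAX_TOTAL_TIME) break; // budget exhausted
    const {jpegSize, webpSize} = await reencode(image);
    results.push({url: image.url, jpegSize, webpSize});
  }
  // Images missing from `results` fall back to the estimate in the audit.
  return results;
}
```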

To me the only tricky thing is that we don't attempt images >2MB but prioritize all results by size. But I understand the tradeoff here between the precision of our estimate and runtime.

This looks great to me.

@patrickhulce
Collaborator Author

Yes?

Yes!

To me the only tricky thing is that we don't attempt images >2mb but prioritize all results by size. But I understand the tradeoff here between precision of our estimate and runtime.

Yeah, this limit to me is more that we should be flagging this image via other means. If you're serving an image over 2MB, it's not just an "encode with the right JPEG settings" issue, it's a "this image is way too effin' big" issue, so let's not spend time focusing on it (related #7236). I'm open to other constants for this though. Maybe 3MB? 5MB?

Assuming it falls into the optimization bucket though, it does make sense to prioritize the biggest byte savings first.

@paulirish
Member

Yeah, this limit to me is more that we should be flagging this image via other means. If you're serving an image over 2MB, it's not just an "encode with the right JPEG settings issue" it's a "this image is way too effin' big issue", so let's not spend time focusing on it

Yah I like that approach.

Works for me.

@patrickhulce patrickhulce merged commit 5674d61 into master Mar 7, 2019
@patrickhulce patrickhulce deleted the optimized_images branch March 7, 2019 19:47
Successfully merging this pull request may close these issues.

OptimizedImages gatherer can take >20s