
make reduction-effort a configuration item #289

Closed

frankenbubble opened this issue Jun 16, 2021 · 2 comments

Comments

@frankenbubble

Currently we can't configure reduction-effort, which defaults to 4. This means that resizing JPEGs to larger WebP files uses more CPU to reduce the output size, which results in higher latency for a small file-size saving.

Please make reduction-effort a configuration item; we would like to set it to 0 (zero).

See https://sharp.pixelplumbing.com/api-output#webp:
options.reductionEffort (number): level of CPU effort to reduce file size, integer 0-6 (optional, default 4)
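For context, a minimal sketch (not the handler's actual code) of where such a configuration value would be passed to sharp; the paths, quality value, and function name are illustrative, and the option name follows the sharp documentation linked above:

import sharp from "sharp";

// Convert an image to WebP, exposing reductionEffort (0 = fastest, 6 = smallest output).
async function toWebp(input: string, output: string, reductionEffort = 0): Promise<void> {
  await sharp(input)
    .webp({ quality: 75, reductionEffort })
    .toFile(output);
}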

Having started using this service in production, we noticed that our p90 origin latency and integration latency were high. Investigating further, we found the cause was WebP operations. Benchmarking with the vips command-line tool, we found significant latency differences for this value.

time vips resize /tmp/photo.jpeg /tmp/photo.webp[Q=75,reduction-effort=0,strip] "1.1"

real 0m0.193s
user 0m0.310s
sys 0m0.049s

time vips resize /tmp/photo.jpeg /tmp/photo.webp[Q=75,reduction-effort=4,strip] "1.1"

real 0m0.479s
user 0m0.551s
sys 0m0.056s

Resultant file sizes:
242,758 bytes for reduction-effort=4 (default behaviour)
275,994 bytes for reduction-effort=0
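From the numbers above, for this sample image reduction-effort=0 produces a file roughly 13.7% larger (275,994 / 242,758 ≈ 1.137) in exchange for about 60% less wall-clock time (0.193 s vs 0.479 s).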

I understand CF will cache this so it's a one-time cost, but we would still like to be able to choose our preferred value.

@G-Lenz
Contributor

G-Lenz commented Jun 28, 2021

Hi @frankenbubble, thank you for your request. We will add this to our backlog for further investigation.

@fisenkodv
Contributor

@frankenbubble, we have updated our solution and the issue has been fixed; please see the change here. If you still see the issue with the latest version (v6.0.0), please feel free to reopen the issue.

You can refer to the recent changes here.
