-
Task
-
Resolution: Fixed
-
Should have
-
None
-
8
-
Performance
-
Sprint #146
Follow-up from PLANET-5142
Currently, when an image is uploaded, the resizing is handled by the PHP server, which then uploads it to Google Cloud Storage. This resizing does not optimize the image (or if it does, it is very ineffective), so the resulting images have a much larger file size than properly optimized versions. This significantly increases the bandwidth consumed when visiting the site and hurts our SEO, since frontend performance is a major ranking factor.
Since Cloudflare image optimization works with a public URL, it should be very easy to make the PHP server request an optimized version of an image after it is uploaded. This could be done on the original image, letting Cloudflare do the resize as well. But it may be more robust to let PHP do the resizing as it does now, and then, just before uploading to Google Cloud Storage, request an optimized version of the final image and send that to Cloud Storage instead.
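As a rough sketch of the "request via public URL" idea: Cloudflare exposes image optimization through its documented `/cdn-cgi/image/<options>/<source-url>` path convention, so the PHP side mostly needs to build that URL. The zone and source URLs below are hypothetical placeholders, not our actual domains.

```php
<?php
// Sketch: build a Cloudflare Image Resizing URL for a publicly reachable image.
// The /cdn-cgi/image/ path format is Cloudflare's documented convention;
// the zone and source URL in the example are hypothetical.
function build_cf_image_url(string $zone, array $options, string $source_url): string {
    // Options are comma-separated key=value pairs, e.g. width=800,quality=80.
    $pairs = [];
    foreach ($options as $key => $value) {
        $pairs[] = $key . '=' . $value;
    }
    return rtrim($zone, '/') . '/cdn-cgi/image/' . implode(',', $pairs) . '/' . $source_url;
}

// Example: request an optimized 800px-wide rendition of an uploaded image.
echo build_cf_image_url(
    'https://www.example.org',
    ['width' => 800, 'quality' => 80, 'format' => 'auto'],
    'https://storage.googleapis.com/example-bucket/uploads/photo.jpg'
), "\n";
```

Fetching that URL should return the optimized bytes, which is what the upload step would then send to Cloud Storage.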
Tasks
- Investigate whether we can override Imagick so that it resizes images by calling Cloudflare instead of doing it itself, or add a step that requests the optimized version of the final image from Cloudflare before sending it to Cloud Storage. We already have a class which extends the Imagick class, which can probably be used: https://github.com/greenpeace/planet4-master-theme/blob/ba92fc46f7b1224768b4f450d71fed9846b391e6/src/ImageCompression.php#L17-L17
- Cloudflare's image optimization can also serve different formats based on the user agent (e.g. WebP, if supported by the browser). Investigate whether we can request all possible formats and do client capability detection ourselves to serve the most appropriate image. This probably requires serving a different URL for each format, which may be a blocker.
- Investigate how we can optimize images that are already sitting unoptimized in Cloud Storage.
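The "optimize just before upload" variant from the first task could look roughly like the sketch below. Every function and helper name here is a hypothetical illustration, not the actual planet4-master-theme API; it only assumes the image (or its original) is already reachable at a public URL, which Cloudflare requires.

```php
<?php
// Sketch of the "fetch optimized bytes before the Cloud Storage upload" flow.
// All names below are hypothetical; the real code would live in or near the
// existing ImageCompression class in planet4-master-theme.
function fetch_optimized_bytes(string $cf_url): ?string {
    // Ask Cloudflare for the optimized rendition; @ suppresses the warning
    // on failure so we can fall back gracefully.
    $bytes = @file_get_contents($cf_url);
    return $bytes === false ? null : $bytes;
}

function upload_image(string $local_path, string $public_url): void {
    // Hypothetical zone and option string for the optimization request.
    $cf_url = 'https://www.example.org/cdn-cgi/image/quality=80,format=auto/' . $public_url;
    $optimized = fetch_optimized_bytes($cf_url);
    // Fall back to the locally resized file if Cloudflare is unreachable,
    // so uploads never fail just because optimization did.
    $payload = $optimized ?? file_get_contents($local_path);
    // send_to_cloud_storage($payload); // hypothetical GCS upload helper
}
```

The fallback branch matters for robustness: if the Cloudflare request fails, the site still gets the PHP-resized image, just without the extra optimization.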