I want to provide a smooth experience to my site visitors, so I work on accessibility and make sure the site works without JavaScript enabled. I care about page load time because some pages contain large illustrations, so I minify my HTML.
But one thing makes turning my blog light as a feather a pain in the ass.
### The hurdle

See, a major win in traffic reduction (and thus latency savings on mobile!) comes not from minification but from compression. HTTP supports gzip and Brotli via the Content-Encoding header. This is opt-in because compression takes resources, so transferring uncompressed data might be faster.
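You can watch this negotiation from the browser side: the browser advertises the codecs it supports in Accept-Encoding, and the server answers with the one it picked in Content-Encoding. A quick way to peek at the server's choice from the console — a sketch; the header is reliably visible only on same-origin responses:

```javascript
// Check which encoding the server used for the current page.
// The browser decompresses the body transparently; this header only
// tells you what actually went over the wire.
const res = await fetch(location.href);
console.log(res.headers.get("content-encoding")); // "br", "gzip", or null
```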
Typically, Brotli compresses better than gzip, and gzip is better than nothing. gzip is so cheap that everyone enables it by default, but Brotli is way slower to compress.
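If you want to measure the gap yourself, Node's built-in zlib module speaks both formats. A minimal sketch, with post.html standing in for any large page:

```javascript
// Compare gzip and Brotli output sizes for the same file, both at
// maximum quality, using Node's built-in zlib bindings.
const zlib = require("node:zlib");
const fs = require("node:fs");

const html = fs.readFileSync("post.html"); // any large HTML file
const gzipped = zlib.gzipSync(html, { level: 9 });
const brotlied = zlib.brotliCompressSync(html, {
  params: { [zlib.constants.BROTLI_PARAM_QUALITY]: 11 },
});
console.log(`gzip:   ${gzipped.length} bytes`);
console.log(`brotli: ${brotlied.length} bytes`);
```

On text like HTML, Brotli at quality 11 usually wins by a comfortable margin — but it also takes noticeably longer, which is exactly why servers don't enable it as casually as gzip.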
Annoyingly, I host my blog on GitHub Pages, which doesn’t support Brotli. So Recovering garbled Bitcoin addresses, the longest post on my site, takes 92 KiB instead of 37 KiB. This amounts to an unnecessary 2.5x increase in load time.
### A naive idea

There’s no reason why GitHub can’t support Brotli. Even if compressing files in-flight is slow, GitHub could still allow repo owners to upload pre-compressed data and use that.
GitHub doesn’t do that for us, but we can still take advantage of precompressed data. We’ll just have to manually decompress it in JavaScript on the client side.
Like a good developer, the first thing I do upon finding a problem is search for a solution on Google. brotli-dec-wasm turned up after a quick search, providing a 200 KB Brotli decompressor in WASM. tiny-brotli-dec-wasm is even smaller, at 71 KiB.
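Wiring one of these up would look roughly like this. A sketch only: decompress stands in for whatever the decoder library actually exports (check its real API), and the .br URL is made up for illustration.

```javascript
// Sketch: fetch the pre-compressed page and decompress it in the browser.
// `decompress` is a placeholder for the real decoder export; the .br
// path is hypothetical.
const response = await fetch("/posts/recovering-garbled-bitcoin-addresses.html.br");
const compressed = new Uint8Array(await response.arrayBuffer());
const html = new TextDecoder().decode(decompress(compressed));
document.documentElement.innerHTML = html; // naive injection; scripts won't re-run
```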
Alright, so we’re dealing with 92 KiB for gzip vs 37 + 71 = 108 KiB for Brotli. Umm…