I abandoned OpenLiteSpeed and went back to good ol’ Nginx

Friday, January 26, 2024
Ish is on fire, yo. (credit: Tim Macpherson / Getty Images)

Since 2017, in what spare time I have (ha!), I've helped my colleague Eric Berger host his Houston-area weather forecasting site, Space City Weather. It's an interesting hosting challenge—on a typical day, SCW does maybe 20,000–30,000 page views to 10,000–15,000 unique visitors, which is a relatively easy load to handle with minimal work. But when severe weather events happen—especially in the summer, when hurricanes lurk in the Gulf of Mexico—the site's traffic can spike to more than a million page views in 12 hours. That level of traffic requires a bit more prep to handle.

Hey, it's Space City Weather (https://spacecityweather.com)! (credit: Lee Hutchinson)

For a very long time, I ran SCW on a backend stack made up of HAProxy for SSL termination, Varnish Cache for on-box caching, and Nginx for the actual web server application—all fronted by Cloudflare to absorb the majority of the load. (I wrote about this setup at length on Ars a few years ago for folks who want some more in-depth details.) This stack was fully battle-tested and ready to devour whatever traffic we threw at it, but it was also annoyingly complex, with multiple cache layers to contend with, and that complexity made troubleshooting issues more difficult than I would have liked.
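To make the division of labor concrete, here is a minimal sketch of that kind of three-layer chain. The ports, certificate path, hostname, and document root are illustrative placeholders, not the actual Space City Weather configuration: HAProxy terminates TLS on port 443 and hands plain HTTP to Varnish, Varnish caches and forwards misses to Nginx on a local port, and Nginx serves the site itself.

    # haproxy.cfg -- terminate TLS, pass plain HTTP to Varnish (ports and paths are placeholders)
    frontend https_in
        mode http
        bind :443 ssl crt /etc/haproxy/certs/site.pem
        default_backend varnish

    backend varnish
        mode http
        server cache1 127.0.0.1:6081 check

    # default.vcl -- Varnish caches responses in front of the web server
    vcl 4.1;
    backend default {
        .host = "127.0.0.1";
        .port = "8080";
    }

    # nginx server block -- the origin, listening only on localhost
    server {
        listen 127.0.0.1:8080;
        server_name example.com;
        root /var/www/html;
    }

Each layer has exactly one job, but a cache miss now traverses three separate processes (plus Cloudflare in front of everything), and each layer keeps its own logs and its own cache or connection state to reason about when something goes wrong.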

So during some winter downtime two years ago, I took the opportunity to jettison some complexity and reduce the hosting stack down to a single monolithic web server application: OpenLiteSpeed.


