What do all of these websites have in common? They bring ungodly amounts of traffic to the (un)lucky patron who reaches the frontpage – so much that it brings a good portion of hosts to their knees with the sudden surge in traffic. Shared hosting providers are affected the most by this, since there are numerous websites on the same server – meaning there is *typically* less room for excessive resource usage.
Well, about a week ago one of our customers got on the frontpage of Digg and Reddit, and saw a huge amount of popularity on other social networks such as Mixx and StumbleUpon (granted, this isn’t the first time one of our customers has been “Dugg” – it was just the first time we’ve seen such a large influx of traffic in such a short period of time). After seeing Apache get a little moody in the evening we took a peek to see why.. after a little bit of detective work we saw a huge number of referrers from Digg, Reddit and crew.
After watching the server for a good 10-20 minutes during the surge of traffic we were pleasantly greeted with a < 1.00 load average through the whole debacle – the WordPress blog that was serving the content didn’t even hiccup (so for those of you who say 25 simultaneous MySQL connections isn’t sufficient, well, neener).
Even though in this instance no action was needed on anyone’s part – everything was being served properly, with no slowdowns; the server just burped and continued spitting out webpages – we did prepare a simple static cache of the page that was receiving the most traffic, just in case the client’s website started causing an issue or slowing down. It was a simple “.htaccess” rule that would check if the referring URL was the Digg URL – if so, it would point to a static HTML file instead of making PHP/MySQL serve up the page over and over. Though we didn’t need to use it, it prompted me to write this post with some tips for when you’re receiving a high amount of traffic on ANY shared provider .. there is a chance that you’ll use up enough resources to be suspended for affecting other customers’ experience (better to play it safe, eh?).
- If the webpage that’s being pounded with requests is just static content (in this case it was just a simple image) – simply save the HTML portion, pop it into a file and upload it to your root directory. Once you do that, set up a simple .htaccess rule to point all traffic requesting that given page to the static file. Quick, yet fairly effective.
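A minimal sketch of what such a rule could look like using mod_rewrite – note the referrer pattern and filenames here are placeholders, so adjust them to match the page that’s actually getting hammered:

```apache
# Serve a pre-saved static copy to visitors arriving from Digg,
# instead of letting PHP/MySQL rebuild the page on every request.
# "digg\.com" and "popular-page" below are example values only.
RewriteEngine On
RewriteCond %{HTTP_REFERER} digg\.com [NC]
RewriteRule ^popular-page/?$ /popular-page-static.html [L]
```

The `[NC]` flag makes the referrer match case-insensitive, and `[L]` stops further rewriting once the static copy is served – regular visitors who didn’t come from Digg still get the normal dynamic page.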
- If you’re using WordPress, INSTALL A CACHING PLUGIN. Yes, this will save you hours of grief if your website decides to get all popular in a hurry. Check out WP Super Cache or WP-Cache 2.0 (for HawkHost customers, please feel free to send us a ticket and we’ll help you get this set up – we’re just cool like that).
- If you’re the monkey who decided it would be a good idea to submit your website (or article) to a gajillion social media websites and EXPECT it to be big – maybe think about linking to a Google Cache copy or using Coral Cache. This is a simple yet VERY effective way to prevent your website from getting sluggish.
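For the Coral Cache route, the trick is simply appending `.nyud.net` to the hostname in the link you submit – Coral’s network then serves a cached copy instead of every visitor hitting your server directly. For example (example.com being a placeholder, of course):

```
http://example.com/my-article/          <- original, hits your server
http://example.com.nyud.net/my-article/ <- Coral Cache copy
```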
- These are just simple ways to handle a huge amount of traffic – if you see your website causing issues with any host you’re at on a steady basis, it may be time to check out a Virtual Private Server or a Dedicated Server .. this will allow you a guaranteed amount of resources to use and abuse to your liking. Keep in mind, though, that when going this route they’re *typically* not managed .. so you have to do the grunt work.
This just touches the surface of simple ways to “cache” your website – look for a post in the near future explaining these methods further, along with some other ways to optimize your website.
Just for kicks, here are the bandwidth graphs from when the client’s site got bombarded by the numerous sites .. needless to say it kind of “stuck out” a bit more than normal ;).